# GitHub File Size Limits

Repository and file upload limits for GitHub
| Threshold | What happens |
|---|---|
| 50 MB | File size warning — GitHub shows a warning |
| 100 MB | Hard limit — maximum per file, push is rejected |
| 2 GB | Maximum per file with Git LFS |
## GitHub Detailed Limits
| Feature | Free | Pro ($4/mo) | Team ($4/user/mo) | Enterprise |
|---|---|---|---|---|
| File Size Limit | 100 MB | 100 MB | 100 MB | 100 MB |
| Repository Size | ~1 GB recommended | ~1 GB recommended | ~1 GB recommended | ~1 GB recommended |
| Git LFS Storage | 1 GB | 1 GB | 1 GB | Custom |
| Git LFS Bandwidth | 1 GB/month | 1 GB/month | 1 GB/month | Custom |
| Release Assets | 2 GB per file | 2 GB per file | 2 GB per file | 2 GB per file |
| GitHub Actions | 2,000 min/mo | 3,000 min/mo | 3,000 min/mo | 50,000 min/mo |
| Packages Storage | 500 MB | 2 GB | 2 GB | 50 GB |
## Git LFS (Large File Storage)

### What is Git LFS?

Git LFS (Large File Storage) is a Git extension for handling large files efficiently. Instead of storing large files in the Git repository itself, LFS commits lightweight pointer files while the actual content lives on a remote LFS server.
- Maximum 2 GB per file
- Unlimited number of files
- 1 GB free storage + bandwidth
- $5/month per 50 GB data pack
### When to Use Git LFS
- Media files: Images, videos, audio
- Graphics: PSD, AI, Sketch files
- Datasets: Large CSV, JSON files
- Binaries: Compiled executables
- Archives: ZIP, TAR.GZ files
- 3D models: FBX, OBJ, BLEND
### Set Up Git LFS

```bash
# Initialize Git LFS (install the git-lfs binary first, e.g. via your package manager)
git lfs install

# Track file types
git lfs track "*.psd"
git lfs track "*.zip"

# Commit the .gitattributes file that `git lfs track` creates
git add .gitattributes
git commit -m "Track large files with LFS"
```
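For reference, each `git lfs track` pattern becomes one line in `.gitattributes`. A minimal sketch of that format, written by hand so it runs even without the git-lfs binary installed:

```shell
# What `git lfs track "*.psd"` records in .gitattributes — one line per pattern
# (demo in a throwaway directory)
cd "$(mktemp -d)"
printf '%s\n' '*.psd filter=lfs diff=lfs merge=lfs -text' >> .gitattributes
cat .gitattributes
# → *.psd filter=lfs diff=lfs merge=lfs -text
```

Committing `.gitattributes` is what makes the tracking take effect for everyone who clones the repository.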
## Handling Large Files
### 1. Use Git LFS
Best for version-controlled large files:
- Install Git LFS extension
- Track specific file patterns
- Automatic for tracked files
- 1 GB free, scalable pricing
### 2. Use .gitignore

Exclude files from the repository:
- Add patterns to .gitignore
- Store files locally only
- Good for build artifacts
- Share via other means

```gitignore
# .gitignore
*.zip
*.mp4
dist/
node_modules/
```
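You can confirm the patterns match what you expect with `git check-ignore -v`, which reports exactly which rule ignores each path. A self-contained sketch in a throwaway repo (file names are made up for the demo):

```shell
# Verify .gitignore patterns before relying on them
cd "$(mktemp -d)" && git init -q
printf '*.zip\n*.mp4\ndist/\nnode_modules/\n' > .gitignore
mkdir -p dist
touch release.zip dist/bundle.js
git check-ignore -v release.zip dist/bundle.js
```

Each output line shows the .gitignore line number and the pattern that matched, which is handy when multiple ignore files interact.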
### 3. GitHub Releases
Distribute large binaries:
- 2 GB per file limit
- Attach to release tags
- Doesn't count toward repo size
- Perfect for distributions
### 4. External Storage
Link to cloud services:
- Store on S3, Google Cloud, etc.
- Include download links in README
- Use for massive datasets
- Cost-effective for large files
## Best Practices
### 1. Keep Repositories Small
Optimize repository size:
- Stay under 1 GB if possible
- Use .gitignore for dependencies
- Don't commit build artifacts
- Avoid tracking binary files
- Split large projects into submodules
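The last point above can be sketched with local throwaway repos (a real project would point at a hosted URL instead of a filesystem path; `protocol.file.allow=always` is needed on recent git versions for file-path submodules):

```shell
# Split a heavy component into its own repo and reference it as a submodule
tmp=$(mktemp -d)
git -C "$tmp" init -q lib
git -C "$tmp/lib" -c user.email=you@example.com -c user.name=you \
    commit -q --allow-empty -m "init lib"
git -C "$tmp" init -q app
cd "$tmp/app"
git -c protocol.file.allow=always submodule --quiet add "$tmp/lib" vendor/lib
git -c user.email=you@example.com -c user.name=you commit -qm "add lib as submodule"
git submodule status
```

The superproject stores only a commit pointer plus a `.gitmodules` entry, so the heavy component's history no longer bloats every clone of the main repo.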
### 2. Compress Before Committing
Reduce file sizes:
- Compress images before adding
- Use efficient formats (WebP, AVIF)
- Minify JSON/XML files
- Strip metadata from media files
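As a small example of the minification point, stripping whitespace from JSON before committing (the file and its contents here are made up for the demo):

```shell
# Minify a JSON file before committing it
cd "$(mktemp -d)"
printf '{\n  "name": "demo",\n  "items": [1, 2, 3]\n}\n' > data.json
python3 -c 'import json, sys; json.dump(json.load(open(sys.argv[1])), open(sys.argv[2], "w"), separators=(",", ":"))' data.json data.min.json
wc -c data.json data.min.json   # the minified copy is smaller
cat data.min.json
```

The same idea applies to images: re-encode to WebP/AVIF and strip EXIF metadata before the file ever enters history, since Git keeps every version forever.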
### 3. Clean Repository History

Remove accidentally committed large files:
- Use git filter-repo or BFG Repo-Cleaner (both faster and safer than the deprecated git filter-branch)
- Rewrite history to remove the files
- Force push to update the remote

```bash
bfg --strip-blobs-bigger-than 50M
```
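Before rewriting history it helps to see which blobs are actually large. A standard recipe using `git rev-list` and `git cat-file` (shown here against a self-contained throwaway repo containing one deliberately large file):

```shell
# List the largest blobs in a repository's history
cd "$(mktemp -d)" && git init -q
git config user.email you@example.com && git config user.name you
head -c 100000 /dev/zero > big.bin
echo "small" > small.txt
git add . && git commit -qm "demo commit"

git rev-list --objects --all |
  git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' |
  awk '$1 == "blob" {print $3, $4}' |
  sort -rn | head -5
```

The output lists blob size in bytes followed by the path, largest first, so you know exactly which files (and which size threshold) to feed to BFG or git filter-repo.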
### 4. Separate Data from Code
Architectural approach:
- Code in Git, data elsewhere
- Reference external datasets
- Use DVC for data versioning
- Download data during setup
## Quick Tips
### ⚠️ Warning Signs
- Push/pull taking very long
- Clone size over 1 GB
- Warning about 50 MB files
- Push rejected (100 MB+ file)
- Time to investigate and optimize
### 📊 Check Repository Size

On GitHub: Settings → General shows the repository size.

Locally:

```bash
git count-objects -vH
```
### 💡 Common Mistakes
- Committing node_modules/
- Tracking compiled binaries
- Adding entire datasets
- Committing .env files
- Use .gitignore to prevent!
## Frequently Asked Questions
### What happens if I push a file over 100 MB?

GitHub rejects the push with the error "File is X MB; this exceeds GitHub's file size limit of 100 MB." You must remove the file from your commits before pushing. If it exists only in your most recent unpushed commit, `git rm --cached` followed by `git commit --amend` is enough; if it is buried deeper in history, use git filter-repo or BFG Repo-Cleaner to remove it.
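The simple case — the file is only in your latest unpushed commit — looks like this (demo in a throwaway repo; `big.bin` stands in for the offending file):

```shell
# Untrack an oversized file and amend the last commit before pushing
cd "$(mktemp -d)" && git init -q
git config user.email you@example.com && git config user.name you
head -c 1000 /dev/zero > big.bin
echo 'print("hi")' > app.py
git add . && git commit -qm "add app and big file"

git rm -q --cached big.bin        # stop tracking; keep the local copy
echo 'big.bin' >> .gitignore      # prevent re-adding it by accident
git add .gitignore
git commit -q --amend -m "add app"
git ls-files                      # big.bin is no longer in the commit
```

Note that `--amend` only works safely before the commit has been pushed; after that, you are rewriting shared history and need the tools above plus coordination with collaborators.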
### Is the 1 GB repository limit a hard limit?
No, it's a strong recommendation, not a hard limit. Repositories over 1GB work but experience slower clones and performance issues. GitHub may contact you about large repositories. For repos over 5GB, consider splitting into multiple repositories or using Git LFS.
### How much does Git LFS cost?
Free tier: 1 GB storage + 1 GB bandwidth per month. Data packs: $5/month for 50 GB storage + 50 GB bandwidth. You can purchase multiple packs. Bandwidth resets monthly. Storage is cumulative. Free tier is sufficient for small projects with a few large files.
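A back-of-envelope calculation from the pricing above (the 120 GB requirement is a made-up example):

```shell
# Estimate monthly LFS cost: 1 GB free, then $5 per 50 GB data pack
needed_gb=120
paid_gb=$(( needed_gb - 1 ))        # first 1 GB is free
packs=$(( (paid_gb + 49) / 50 ))    # ceiling division: packs needed to cover paid_gb
echo "data packs: $packs, monthly cost: \$$(( packs * 5 ))"
# → data packs: 3, monthly cost: $15
```

So 120 GB of LFS storage needs three data packs (150 GB of paid capacity) at $15/month on top of the free tier.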
### Can I use GitHub to host large datasets?
Not recommended. GitHub is optimized for code, not data. For large datasets (>1GB): 1) Use Git LFS (up to 2GB per file). 2) Host on S3/Google Cloud and link in README. 3) Use specialized data platforms like Kaggle or Zenodo. 4) Consider DVC (Data Version Control).
### How do I remove a large file from Git history?

Use BFG Repo-Cleaner (easiest): `bfg --strip-blobs-bigger-than 50M`. Or git filter-repo, the maintained replacement for the deprecated git filter-branch: `git filter-repo --strip-blobs-bigger-than 50M`. Then force push: `git push origin --force --all`. Warning: this rewrites history, so coordinate with collaborators before pushing.