View in #chat on Slack
@expansivemango: i tossed out a question a few weeks back about using GitLFS and had some follow up questions
is there an upper limit to how large a repository can be? do large repositories (150 GB) need to be broken up into smaller repositories?
@spiderspy: not if you are paying I think but I am really not sure
riding the free way over here 
@expansivemango: i think it’s installed on one of our org’s servers…
@spiderspy: free limit is something like 2 gig LFS I think 
@expansivemango: it should theoretically hold as much as you can fit disks into a rack right?
do you have limitations with how large a single file can be?
@bob.w: With LFS, disk size should be the upper bound. The repo itself doesn’t actually contain those files; instead it keeps small pointer files that reference the actual content, which lives in a separate LFS store.
This way you can clone a repo without having to pull down every iteration of the LFS-tracked files
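The pointer files bob.w describes are just a few lines of text in the Git LFS v1 format (`version` / `oid` / `size`). A rough sketch of what one contains and how it could be read, assuming a made-up `parse_lfs_pointer` helper (the sample hash and size are illustrative values, not from a real file):

```python
# The version/oid/size lines below are the real Git LFS v1 pointer format;
# the parse_lfs_pointer helper and the sample values are made up.
SAMPLE_POINTER = """\
version https://git-lfs.github.com/spec/v1
oid sha256:98ea6e4f216f2fb4b69fff9b3a44842c38686ca685f3f55dc48c5d3fb1107be4
size 132735
"""

def parse_lfs_pointer(text: str) -> dict:
    """Split the 'key value' lines of a pointer file into a dict."""
    fields = {}
    for line in text.splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

ptr = parse_lfs_pointer(SAMPLE_POINTER)
print(ptr["oid"])   # sha256 hash of the real file content
print(ptr["size"])  # size in bytes of the real file, here 132735
```

Since the repo history only stores these tiny stubs, the checked-in history stays small no matter how many large binaries get versioned; the large content is fetched from the LFS store only for the revisions you actually check out.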
@expansivemango: so if we’re seeing issues with the repository it could just be the objects/hooks that grew too large and we just have to run housekeeping tools on those??
…that the repository itself exceeded a threshold, or are you saying that the repository (just what’s in the .git folder) doesn’t have a threshold?
@bob.w: A git repo can get slow if there’s a large number of files and a fairly long history, especially on Windows.
The remote itself could get pretty slow if you’ve got a lot of people accessing it, but that depends on hardware.
@expansivemango: kewl… thanks for your help