
Re: Package Pool Proposal



Jason Gunthorpe wrote:
> > With a mere 111 buckets. Seems reasonable. Otoh, if you don't subdivide x
> > and g, you get:
> >     382 x
> >     347 g
> > In just 56 buckets. Not significantly worse dir size, and a much easier
> > special case to remember.
> 
> I like that, so that would be using a largest prefix match set of
> {[a-z],lib[a-z]}, only the single special case in the hash function.

Of course, amusingly, I think we're back to Guy's original idea. :-)
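(For the record, a rough sketch of that hash function in Python -- the
{[a-z],lib[a-z]} prefix set above is the only special case; the function
name and exact return values here are just illustrative:)

    def pool_bucket(package):
        # First-letter buckets, with the single special case that lib*
        # packages are bucketed by the letter after "lib" (libpng -> libp).
        if package.startswith("lib") and len(package) > 3:
            return "lib" + package[3]
        return package[0]

    # pool_bucket("xfree86") == "x", pool_bucket("libc6") == "libc"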

> > And it's acceptably fast for random single file accesses.)
> 
> Mirroring tools become slower and consume more memory the larger the
> single directory. Primarily because they do need to stat every file and
> they need to perform an in-memory diff of the directory contents. 

I wasn't suggesting Debian do this. As I said, I use it for single-file
access at random -- i.e., if I want to look inside a random .deb, I use that
directory, which is chock full of symlinks, and save a few seconds.
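(If it helps, a rough sketch of how such a symlink directory gets built,
assuming a pool root and a flat target directory -- both paths here are
made up:)

    import os

    POOL = "/mirror/pool"          # hypothetical pool root
    FLAT = "/home/joey/all-debs"   # hypothetical flat symlink directory

    os.makedirs(FLAT, exist_ok=True)
    for dirpath, dirnames, filenames in os.walk(POOL):
        for name in filenames:
            if name.endswith(".deb"):
                # one symlink per .deb, so a single readdir or tab-complete
                # finds any package without walking the pool hierarchy
                link = os.path.join(FLAT, name)
                if not os.path.lexists(link):
                    os.symlink(os.path.join(dirpath, name), link)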

-- 
see shy jo

