Re: Beagle and NFS



On 2/21/06, Joe Shaw <joeshaw@novell.com> wrote:
> Hi,
>
> On Tue, 2006-02-21 at 15:13 -0500, Kevin Kubasik wrote:
> > When I get home, I have a rough (not quite working yet) python script
> > to listen to fam and update a simple text file with directory info; do
> > the beagle python bindings work in that direction as well?
>
> I'm not sure FAM will work; we need the granularity of inotify's events
> (ie, attributes changed vs. file changed).
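
Fair point. Just so I'm sure I follow, is this the granularity you
mean? A minimal pyinotify sketch (pyinotify is only an assumption on
my part; the script doesn't use it yet, and it would have to run on
the machine that actually hosts the files, since inotify won't see
changes made on an NFS mount):

    import pyinotify

    class ChangeLogger(pyinotify.ProcessEvent):
        def process_IN_ATTRIB(self, event):
            # chmod/chown/touch: attributes changed, content untouched
            print("attributes changed:", event.pathname)

        def process_IN_MODIFY(self, event):
            # actual file content changed
            print("file changed:", event.pathname)

    wm = pyinotify.WatchManager()
    mask = pyinotify.IN_ATTRIB | pyinotify.IN_MODIFY
    wm.add_watch('/mnt/home/kevin', mask, rec=True, auto_add=True)
    pyinotify.Notifier(wm, ChangeLogger()).loop()
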
>
> I'm not sure I understand what you're asking about the beagle python
> bindings.

My apologies; I was wondering if the bindings could let the script
act as a queryable source and pass indexing information 'upstream',
in a sense.

> > More importantly, can beagle-build-index be used to update a index
> > without a complete crawl? For example, I index
> > /mnt/home/kevin
> >
> > my little script notices a change at
> > /mnt/home/kevin/src/beagle
> >
> > can I get beagle-build-index to update the index w/o a complete
> > recrawl? Since beagle-build-index has no built-in scheduling features,
> > that's a huge lock-up for the homedir if not.
>
> Right now, beagle-build-index does a complete recrawl, but it only
> reindexes things that have been updated.  I think that just specifying
> subdirectories or maybe even files to it will work, but if not this will
> require a little hacking.
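
If specifying a subdirectory does work, my watcher script could kick
off a targeted rebuild whenever something changes, roughly like this
sketch (the --target/--recursive flags and the index location are my
assumptions about beagle-build-index, not verified):

    import subprocess

    # Hypothetical location of the static index for the NFS homedir.
    INDEX_DIR = '/home/kevin/.beagle-nfs-index'

    def reindex(changed_dir):
        # Recrawl only the subtree that actually changed.
        subprocess.check_call(['beagle-build-index',
                               '--target', INDEX_DIR,
                               '--recursive', changed_dir])

    # e.g. from the watcher's event handler:
    # reindex('/mnt/home/kevin/src/beagle')
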
>
> > I have to think that an implementation of 3 seems like the best idea,
> > we could cheat on handling automagic discovery and all that, and just
> > use a pipe over ssh for basic communication, and require that the user
> > do the key pairing atm. While a nice plan in the future would just
> > have each user of beagle establish a 'beagle username/passwd pair' and
> > coordinate distributed network searching, that doesn't seem to be in
> > the scope of the project at the moment.
>
> Autodiscovery should be pretty easy to do with avahi and if we just pick
> a random port to listen on, it should be easy to handle multiple users
> as well.  This adds a problem with firewalls, though (although the old
> implementation didn't deal with this either).
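
The avahi side does look simple enough. Something like this would
publish the daemon on whatever free port it grabs ('_beagle._tcp' is
just a service type I made up for the sketch):

    import socket
    import avahi, dbus

    # Grab a random free port; the real daemon would keep listening
    # on this socket.
    sock = socket.socket()
    sock.bind(('', 0))
    port = sock.getsockname()[1]

    bus = dbus.SystemBus()
    server = dbus.Interface(
        bus.get_object(avahi.DBUS_NAME, avahi.DBUS_PATH_SERVER),
        avahi.DBUS_INTERFACE_SERVER)
    group = dbus.Interface(
        bus.get_object(avahi.DBUS_NAME, server.EntryGroupNew()),
        avahi.DBUS_INTERFACE_ENTRY_GROUP)
    group.AddService(avahi.IF_UNSPEC, avahi.PROTO_UNSPEC, dbus.UInt32(0),
                     'beagle on %s' % socket.gethostname(), '_beagle._tcp',
                     '', '', dbus.UInt16(port),
                     avahi.string_array_to_txt_array([]))
    group.Commit()
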
>
> As for the actual transport and authentication, I'm not sure what the
> right thing to do here is.  Google Desktop just came out with a new
> version that requires a Google account and appears to upload the indexes
> to Google's servers.  That's an interesting approach and certainly
> solves all of these issues, but obviously makes a lot of people nervous
> about privacy.

As far as implementation goes, centralized services are definitely a
no-go for us, but I was thinking along the lines of simple
individualized authentication: an option in beagle-settings like
[x] Allow Sharing
[________________] Addresses to Listen on
[x] Require Authentication
[________________] Username
[________________] Pass
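
Assuming the shared index ends up being served over HTTP (pure
assumption on my part), the daemon-side check could be as dumb as
basic auth:

    import base64
    from http.server import HTTPServer, BaseHTTPRequestHandler

    # 'kevin'/'secret' stand in for whatever beagle-settings stores.
    EXPECTED = 'Basic ' + base64.b64encode(b'kevin:secret').decode()

    class QueryHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.headers.get('Authorization') != EXPECTED:
                self.send_response(401)
                self.send_header('WWW-Authenticate',
                                 'Basic realm="beagle"')
                self.end_headers()
                return
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b'search results would go here\n')

    httpd = HTTPServer(('', 0), QueryHandler)  # 0 = pick a free port
    httpd.serve_forever()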

Then a separate dialog would handle registering with remote beagle
services for searching. Avahi gives us the hosts running beagle and
sharing their indexes easily enough; just let the user click and
attempt to authenticate on each daemon they want to search.
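
The browsing side of that dialog would be roughly (again assuming the
made-up '_beagle._tcp' type):

    import avahi, dbus
    from dbus.mainloop.glib import DBusGMainLoop
    from gi.repository import GLib

    DBusGMainLoop(set_as_default=True)
    bus = dbus.SystemBus()
    server = dbus.Interface(
        bus.get_object(avahi.DBUS_NAME, avahi.DBUS_PATH_SERVER),
        avahi.DBUS_INTERFACE_SERVER)

    def found(interface, protocol, name, stype, domain, flags):
        # in the real dialog this becomes a clickable entry the user
        # can try to authenticate against
        print('found shared beagle index:', name)

    path = server.ServiceBrowserNew(avahi.IF_UNSPEC, avahi.PROTO_UNSPEC,
                                    '_beagle._tcp', 'local',
                                    dbus.UInt32(0))
    browser = dbus.Interface(bus.get_object(avahi.DBUS_NAME, path),
                             avahi.DBUS_INTERFACE_SERVICE_BROWSER)
    browser.connect_to_signal('ItemNew', found)
    GLib.MainLoop().run()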

Now, snippet/tile generation might pose some issues for the
beagle-search display, but assuming we flag all remote results as
remote, it could be worked around.


I dunno, just thinking aloud.

Cheers,
Kevin Kubasik

