Re: Word attachment...on linuxtoday



Hello,

On Wed, 2002-01-16 at 02:28, Michael Meeks wrote:

[...]

> On Tue, 2002-01-15 at 20:55, Martin Sevior wrote:
> > On Sun, 13 Jan 2002, Charles Iliya Krempeaux wrote:
> > > What Free Software (and Open Source Software) really needs, IMO, is
> > > a good vector graphics engine.  Time and time again, you hear developers
> > > wanting to use SVG in their GUIs.  (Myself being one of them.)  (Yes, I
> > > know there is "librsvg", but it is not good enough.)
> > 
> > Yes indeed. The AbiWord team plan to build such a thing after our 1.0
> > milestone.
> 
> 	You might consider using the already working, and in many ways
> excellent, Free software sodipodi engine:
> 
> 	http://www.gnome.org/gnome-office/sodipodi.shtml
> 
> 	instead of starting from scratch.

Is the sodipodi engine separated out into a library?  (And if so, are
there any docs on using it... besides the source?)


> > > As far as I know though, no one is developing a good and fast (Free)
> > > vector graphics engine.  The only thing I know about is the SVG
> > > component for Mozilla.  But I don't think this is a general rendering
> > > engine.  I doubt it supports any kind of hardware acceleration or
> > > assembly-language-optimization needed for a real high-performance
> > > system.  (But correct me if I am wrong though.)
> 
> 	Optimization (e.g. hardware acceleration) must be subservient to feature
> completeness; possibly no-one is doing this hard core optimization
> because they want to have a working and useful application first. As for
> assembly language, algorithmic optimization will get you far more bang
> for your buck than trying to out-guess the compiler in the vast majority
> of circumstances.

That's not always the case though.  You can't just assume your "problem"
is going to grow so "large" that computational complexity becomes the
dominant factor.  (By "large" I mean that the parameters in your
complexity bounds become sufficiently big... i.e., they tend towards
infinity.)  A lot of the time the problem stays "small" (in terms of
those parameters), and then things like assembly optimization and
hardware acceleration, i.e. the constant factors, make a huge
difference.
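
To make that concrete, here's a toy sketch (my own made-up example,
nothing from librsvg or sodipodi): a linear scan versus a binary search
over a sorted array.  Asymptotically the O(log n) search wins, but when
n never gets past a handful of entries, the "worse" O(n) loop usually
runs faster, because its constant factor (one predictable branch per
element, one pass over contiguous memory) is so much smaller.

    /* Toy example for illustration only. */
    #include <stddef.h>

    /* O(n), but a tight, predictable loop over contiguous memory. */
    int find_linear(const int *a, size_t n, int key)
    {
        for (size_t i = 0; i < n; i++)
            if (a[i] == key)
                return (int)i;
        return -1;
    }

    /* O(log n), but every step is a data-dependent, hard-to-predict
     * branch.  (Assumes 'a' is sorted.) */
    int find_binary(const int *a, size_t n, int key)
    {
        size_t lo = 0, hi = n;
        while (lo < hi) {
            size_t mid = lo + (hi - lo) / 2;
            if (a[mid] < key)
                lo = mid + 1;
            else if (a[mid] > key)
                hi = mid;
            else
                return (int)mid;
        }
        return -1;
    }

Only profiling on realistic inputs tells you where the crossover
actually is, but for something like the handful of control points in a
typical path segment, the "slower" function is normally the faster one.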

Not to mention that with hardware acceleration you can get degrees of
"parallelism" that you just can't achieve with an algorithm implemented
on a general-purpose CPU.
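
For example (again a made-up loop, not code from any real engine),
every pixel in the blend below is computed independently of every other
pixel.  That's exactly the kind of work graphics hardware can do
thousands at a time, while a CPU implementation of the very same
algorithm walks the pixels one at a time (or a handful at a time with
SIMD).

    /* Toy example for illustration only. */
    #include <stdint.h>
    #include <stddef.h>

    /* Blend a span of 8-bit pixels: dst = src*alpha + dst*(1 - alpha),
     * in 8-bit fixed point.  No iteration depends on any other. */
    void blend_span(uint8_t *dst, const uint8_t *src, size_t n,
                    uint8_t alpha)
    {
        for (size_t i = 0; i < n; i++)
            dst[i] = (uint8_t)((src[i] * alpha
                                + dst[i] * (255 - alpha)) / 255);
    }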


I agree with you that computational complexity is important, and
designing your algorithm with it in mind is paramount.  But you also
need to consider when your problem is "small", and put "special
conditions" into your algorithm to accommodate that.  (Among the other
things worth accommodating [with optimizations]... like caching
frequently computed values, hoisting "invariants" out of the
algorithm's loops, etc.)
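
As a trivial (made-up) illustration of that last point: the sine and
cosine below are invariant across the whole loop, so they should be
computed once and cached, not recomputed for every point.

    /* Toy example for illustration only. */
    #include <stddef.h>
    #include <math.h>

    struct point { double x, y; };

    /* Rotate n points by 'angle'.  cos(angle) and sin(angle) are the
     * frequently computed values here; they are loop invariants, so
     * hoist them out and compute them once instead of 2*n times. */
    void rotate_points(struct point *pts, size_t n, double angle)
    {
        const double c = cos(angle);   /* cached once */
        const double s = sin(angle);
        for (size_t i = 0; i < n; i++) {
            const double x = pts[i].x;
            const double y = pts[i].y;
            pts[i].x = x * c - y * s;
            pts[i].y = x * s + y * c;
        }
    }

A decent compiler will often hoist these itself, but doing it by hand
costs nothing and doesn't depend on what the optimizer can prove.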

See ya

     Charles Iliya Krempeaux
     tnt @ linux.ca
     ckrempea @ alumni.sfu.ca



