This is the mail archive of the cygwin@sourceware.cygnus.com mailing list for the Cygwin project.



RE: performance of cygwin32


Dear all,
	I have been chasing performance problems with our build system, which ran fine on Unix but is
snail-like on NT (i.e. it is unusable). I have just discovered more or less the same thing, but I think
the problem is the NT networking, not gnuwin32.

To support this, I timed ls against both local and networked drives - the networked drive took about
10 times as long as the local one, and dir showed around the same ratio (if not more).
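(For anyone who wants to reproduce a comparison like this, here is a minimal sketch using the bash
"time" builtin. The paths are hypothetical; substitute a local directory and a UNC or mapped network
path from your own setup.)

    # Compare a local tree against the same tree on a network share.
    # /c/src and //fileserver/src are made-up paths; use your own.
    time ls -lR /c/src           > /dev/null    # local drive
    time ls -lR //fileserver/src > /dev/null    # networked drive

    # Run each a few times so the filesystem cache settles, then compare
    # the "real" figures that time reports.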

Does anyone know how to assess NT network performance?
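(One crude way to put a number on it, sketched below with made-up names, is to time a large copy
across the wire and divide the size by the elapsed time. This assumes dd and /dev/zero are available
in your installation; any existing large file works just as well.)

    # Create a ~10 MB scratch file, then time copying it to the share.
    # //fileserver/tmp is a hypothetical writable network path.
    dd if=/dev/zero of=bigfile bs=1024 count=10240
    time cp bigfile //fileserver/tmp/bigfile
    # Throughput in KB/s is roughly 10240 divided by the "real" seconds.

Timing the same copy with the native NT copy command would help separate NT networking overhead
from anything the gnuwin32 tools add on top.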

Kevin 

-----Original Message-----
From:	Geoffrey Noer [SMTP:noer@cygnus.com]
Sent:	08 January 1998 05:42
To:	Ed Peschko
Cc:	earnie_boyd@hotmail.com; gnu-win32@cygnus.com
Subject:	Re: performance of cygwin32

Ed Peschko wrote:
> 
> Win32 is slow, but network drives are *glacial*, at least with cygwin32
> tools. I've installed the coolview update (when will it be incorporated
> into the 'vanilla' cygwin32 tools anyways?) and it is going a bit faster.

The "coolview" sources represent a state of the actual Cygnus development
tree as of whenever that was made.  It's made more progress since then;
right now I'm battling race conditions in the signal code...

Right now I can do a complete configure of the compiler tools (including
tcl/tk) in about 20 minutes and build them in an hour and 15 minutes, which
is a huge improvement performance-wise.  But signal handling isn't quite as
robust as it needs to be...
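(For anyone wanting to compare their own machine against those figures, a minimal
sketch is to wrap each phase in the bash "time" builtin; the directory names below
are placeholders for your own source and build trees.)

    # Time the configure and build phases separately.
    cd /obj/cygwin-build                        # hypothetical build dir
    time /src/cygwin/configure > configure.log 2>&1
    time make                  > make.log      2>&1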

-- 
Geoffrey Noer
noer@cygnus.com
-
For help on using this list (especially unsubscribing), send a message to
"gnu-win32-request@cygnus.com" with one line of text: "help".



