From: Jim E. <Jim...@al...> - 2001-10-31 18:15:17
Michael,

A few things are done using GL display lists, but none of the important
things are. I recently experimented with using display lists for all of the
graphics objects in vis5d+, but on my system I didn't see any real
performance improvement, and it caused several other vis5d features to be
short-circuited. I recently merged the vis5d+ code with the NCAR 5.2 code
(which must be what you are using); in doing so I threw out the display-list
experiment rather than trying to merge it. I still have the code in CVS if
you want to experiment with it.

Jim
http://vis5d.sf.net
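For context on the question Mike raises below - whether geometry goes
through OpenGL in immediate mode or as display lists that a parallel
renderer can decompose - here is a minimal sketch of the two approaches.
It is illustrative only: it is not the vis5d rendering code, the function
names are invented, and it assumes a current GL context.

    #include <GL/gl.h>

    /* Immediate mode: every vertex is re-sent to the GL on every frame. */
    static void draw_triangles_immediate( int nverts, const float (*verts)[3] )
    {
       int i;
       glBegin( GL_TRIANGLES );
       for (i = 0; i < nverts; i++)
          glVertex3fv( verts[i] );
       glEnd();
    }

    /* Display list: the same geometry is compiled once and then replayed
     * with a single glCallList(), which is the form a display-list-based
     * renderer can distribute. */
    static GLuint compile_triangle_list( int nverts, const float (*verts)[3] )
    {
       GLuint list = glGenLists( 1 );
       glNewList( list, GL_COMPILE );
       draw_triangles_immediate( nverts, verts );
       glEndList();
       return list;  /* draw with glCallList(list); free with glDeleteLists(list, 1) */
    }

As noted above, compiling everything into lists is not automatically a win:
on Jim's system it brought no real speedup and short-circuited other vis5d
features.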
Michael Redmond wrote:
> We put the Vis5D 5.2 version on an HP Visualize Center (uses HP-UX 11)
> with success, including stereo viewing. We are now working with HP to get
> it working on an SV6, and we are trying to push the limits of that system
> (16+ cpu parallel rendering engine). I'd love to get my hands on some
> 2 GB vis5d files for testing.
>
> One issue we run into with parallel rendering is whether OpenGL is
> rendered in immediate mode or whether everything is in display lists that
> can be decomposed. To get performance, the Visualize Center and the SV6
> work best with display lists. Bill or ?... which method is used in Vis5d
> 5.2? We would find out quickly with a large data set, but so far we are
> only using the typical LAMPS-scale datasets, which run pretty well.
>
> HP may demonstrate Vis5d on an SV6 at SC 2001 if they can present
> something compelling. If someone is willing to let them (and us) test a
> large dataset, please contact me.
>
> Thanks,
> Mike Redmond
> Associate Director, eMedia Center
>
> ---
>
> At 10:47 PM 10/3/2001 -0600, Don Middleton wrote:
> > We routinely create and visualize >2 GB files with Vis5D. Long ago, I
> > added explicit code to the file handling module to accommodate this.
> > Since then, Jeff Boote has integrated our stereo3D, largefile, and VRML
> > support into the 5.2 release. I'm pretty sure that large files are
> > handled just fine in here (differently, using POSIX constructs instead
> > of SGI extensions, I think), but I'm cc'ing Jeff on this 'cause I
> > haven't tried it myself. The software is available at:
> >
> >   http://www.scd.ucar.edu/vg/SoftwareSystems.html
> >
> > We have a primarily SGI complex, dunno about special Linux configs.
> >
> > cheers - don
> >
> > ---
> > Don Middleton
> > Head, Visualization & Enabling Technologies
> > Scientific Computing Division
> > National Center for Atmospheric Research
> > http://www.scd.ucar.edu/vets
> > PO Box 3000; Boulder, CO 80307-3000
> > Voice: 303-497-1250  Cell: 303-589-5865  FAX: 303-497-1286
> >
> > ----- Original Message -----
> > From: "Dan McCormick" <Mo...@ho...>
> > To: "Leigh Orf" <or...@ma...>
> > Cc: <vis...@ss...>
> > Sent: Wednesday, October 03, 2001 9:01 PM
> > Subject: Re: 2 GB file size limit
> >
> > > Leigh,
> > >
> > > The routines in question are not integral to Vis5d - they are generic
> > > FORTRAN (or C, depending on which you are using). Therefore, this is
> > > either a problem with the OS, the hardware, the file system, or the
> > > compiler you're using. Even though your file system, hardware, and OS
> > > may support 64-bit operations, your compiler's I/O routines may be
> > > limited to 32-bit file pointer values.
> > >
> > > You may want to write several files (maybe one for every 30-minute
> > > period, or one for each variable or two), instead of writing one
> > > large file.
> > >
> > > I hope this helps.
> > >
> > > Regards,
> > > Dan McCormick
> > >
> > > Leigh Orf wrote:
> > > > OK, I should have provided more information.
> > > >
> > > > This occurs with both Linux, kernel 2.4.10, using the Reiserfs
> > > > filesystem (which allows for terabyte-sized files), and IRIX64 6.5
> > > > (the SGI Origin (modi4) at NCSA). This is not a filesystem problem.
> > > >
> > > > What I am calling modern hardware is what I just got for myself - a
> > > > 1.4 GHz Athlon with 1 GB of memory. I do numerical modeling and
> > > > lots of visualization/rendering - I have been using vis5d since
> > > > 1990, when I had to reserve time on a Stardent Stellar machine at
> > > > SSEC, where it was developed. A two-hour simulation with 6
> > > > variables, data every minute, on a ~100x100x60 grid can get big -
> > > > and I am using a compression factor of 4 for accuracy.
> > > >
> > > > Can anyone tell me that they have created vis5d files larger than
> > > > 2 GB successfully?
> > > >
> > > > The errors that go to stderr are of the "perhaps disk is full?"
> > > > variety.
> > > >
> > > > Leigh Orf
> > > >
> > > > Glenn Carver wrote:
> > > >
> > > > | Leigh,
> > > > |
> > > > | You have probably hit a file size limit on the operating system.
> > > > | Various unixes have a 2 GB limit on a single file because this is
> > > > | the maximum value of a signed 32-bit integer. This limitation is
> > > > | removed in 64-bit operating systems such as Sun's Solaris 8.
> > > > |
> > > > | Glenn
> > > >
> > > > Janko Hauser wrote:
> > > >
> > > > | Leigh Orf writes:
> > > > |
> > > > | > I've run into a problem creating large vis5d files, namely that
> > > > | > once a vis5d file hits 2 GB the routines to write it fail.
> > > > | >
> > > > | > Is this an integer overflow problem, or an inherent problem
> > > > | > with the vis5d file format? Can it be easily fixed? With modern
> > > > | > hardware I can easily visualize really big datasets, and the
> > > > | > 2 GB file size limit will become a real problem.
> > > > |
> > > > | Not only to help with your problem, it would be quite interesting
> > > > | to know what you call modern hardware and also which OS you are
> > > > | using. Or is hardware with more than 2 GB of RAM in common use
> > > > | everywhere :-)? If that is the case, I have some more arguments
> > > > | for getting new hardware for the department.
> > > > |
> > > > | __Janko
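A footnote on the 2 GB limit discussed above: a signed 32-bit file offset
tops out at 2^31 - 1 = 2,147,483,647 bytes, which is why writes fail right
at the 2 GB mark. Below is a minimal sketch of one common way to get the
kind of POSIX-style large-file support Don mentions: compile with 64-bit
file offsets so the ordinary stdio calls can address files past 2 GB,
instead of calling explicit 64-bit variants. The function name and error
handling are invented for illustration; this is not the actual vis5d
file-handling code.

    /* The feature macros must be defined (or passed as -D compiler flags)
     * before any system header is included. */
    #define _LARGEFILE_SOURCE
    #define _FILE_OFFSET_BITS 64   /* 64-bit off_t, fseeko(), ftello() on LFS-capable systems */

    #include <stdio.h>
    #include <sys/types.h>

    /* Seek to an absolute byte offset that may exceed 2^31 - 1 bytes. */
    static int seek_to_offset( FILE *f, off_t offset )
    {
       if (fseeko( f, offset, SEEK_SET ) != 0) {
          perror( "fseeko" );
          return 0;
       }
       return 1;
    }

If the compiler's C or FORTRAN run-time library is still limited to 32-bit
file pointers, Dan's workaround of splitting the output into several
smaller files is the practical fallback.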