Questions and Answers : Getting started : THC run?
Joined: 7 Aug 04 Posts: 4 Credit: 226,618 RAC: 0

Is there any particular reason why the THC run is not available for the BOINC beta? Will it be available for BOINC once it goes public on 8/26/04?
Joined: 5 Aug 04 Posts: 907 Credit: 299,864 RAC: 0

The THC run won't be under BOINC; they feel that there are enough runs under "classic CPDN" for a good ensemble of THC experiments. The next BOINC model will be the HadSM3 with sulphur cycle, and then HadCM3 (the coupled model).
Joined: 5 Aug 04 Posts: 55 Credit: 87,392 RAC: 0

> the THC run won't be under BOINC. they feel that there are enough runs under
> the "classic CPDN" for a good ensemble of THC experiments. The next BOINC
> model will be the HadSM3 with sulphur cycle, and then HadCM3 (the coupled
> model).

Are these still at the current resolution of 96x73, or are we going to get higher-resolution models at the same time?

<a href="http://www.users.globalnet.co.uk/~sykesm/cpdn.html"><img src="http://www.users.globalnet.co.uk/~sykesm/gfx/sig.jpg"></a>
Joined: 5 Aug 04 Posts: 907 Credit: 299,864 RAC: 0

> Are these still at the current resolution of 96x73 or are we going to get
> higher resolution models at the same time?

The sulphur cycle will be 96x73. I think they're debating on the HadCM3, i.e. we could offer the high-res one or the low (96x73), or maybe offer both, and with BOINC if we get in a CPU with a lot of available RAM and bandwidth and disk space, they can get the hi-res one to try (with an appropriate "fraction" for cobblestones to show the difference etc).
Joined: 5 Aug 04 Posts: 55 Credit: 87,392 RAC: 0

> the sulphur cycle will be 96x73, I think they're debating on the hadcm3, i.e.
> we could offer the high res one or the low (96/73), or maybe offer both and
> with BOINC if we get in a CPU with a lot of available RAM and bandwidth and
> disk space they can get the hi-res one to try (with an appropriate "fraction"
> for cobblestones to show the difference etc)

So what resolution is 'hi-res', and what sort of machine do you need for a chance to run it?

<a href="http://www.users.globalnet.co.uk/~sykesm/cpdn.html"><img src="http://www.users.globalnet.co.uk/~sykesm/gfx/sig.jpg"></a>
Joined: 5 Aug 04 Posts: 907 Credit: 299,864 RAC: 0

There are two higher resolutions; I think it's double each dimension for the highest and 1.5x for the medium (so that's 192x146 and 144x110 respectively). We had an in-house BOINC app we were playing with at the highest res; it takes about 800MB of RAM, and the file output is similarly scaled up! It's neat though: the resolution is high enough that you can see storm fronts and hurricanes moving. I suppose someday we could just offer it publicly under BOINC and it would only go to those machines that have a gig of RAM, P4s, etc.
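A quick sanity check of the grid dimensions quoted above (my own arithmetic, not from the project):

```python
# Scaling the standard 96x73 horizontal grid by the factors mentioned above.
base_lon, base_lat = 96, 73

highest = (base_lon * 2, base_lat * 2)                   # doubled in each dimension
medium = (round(base_lon * 1.5), round(base_lat * 1.5))  # 1.5x in each dimension

print(highest)  # (192, 146)
print(medium)   # (144, 110) -- 109.5 rounds to 110
```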
Joined: 5 Aug 04 Posts: 55 Credit: 87,392 RAC: 0

> there's two higher resolutions, I think it's double each dimension for the
> highest and 1.5 for the medium (so that's 192x146 and 144x110 respectively).
> we had an in-house boinc app we were playing with the highest res, it takes
> about 800MB of RAM and the file output is similarly scaled up! it's neat
> though, the resolution is high enough that you can see storm fronts and
> hurricans moving. I suppose someday we could just offer it publicly under
> boinc and it would only go to those machines that have a gig of ram, P4's
> etc.

Why so much memory? Doubling the x and y should quadruple the amount of data, from about 80MB to 320MB. Or are you doubling the levels? Or did that include a dynamic ocean with 19 levels as well? I like the idea of seeing hurricanes though...

<a href="http://www.users.globalnet.co.uk/~sykesm/cpdn.html"><img src="http://www.users.globalnet.co.uk/~sykesm/gfx/sig.jpg"></a>
Joined: 5 Aug 04 Posts: 1496 Credit: 95,522,203 RAC: 0

> there's two higher resolutions, I think it's double each dimension for the
> highest and 1.5 for the medium (so that's 192x146 and 144x110 respectively).
> we had an in-house boinc app we were playing with the highest res, it takes
> about 800MB of RAM and the file output is similarly scaled up! it's neat
> though, the resolution is high enough that you can see storm fronts and
> hurricans moving. I suppose someday we could just offer it publicly under
> boinc and it would only go to those machines that have a gig of ram, P4's
> etc.

Sounds interesting, Carl, count me in. (For my P4s, though, it seems prudent to double the existing gig of RAM or watch them thrash themselves to death paging and swapping to support two models.) What happens to runtime?

Jim
________________________________________________
It is impossible to enjoy idling thoroughly unless one has plenty of work to do. -- Jerome K. Jerome (1859-1927)
Joined: 5 Aug 04 Posts: 907 Credit: 299,864 RAC: 0

Oh yeah, I think it's 30 levels in the vertical compared to 17 for the current model. I don't quite understand why the memory usage shoots up so much; I assume it's just Met Office "mojo." The one we tried was a slab ocean too, so a dynamically coupled one would probably be 2GB of RAM! :-)

Runtime was quite a bit slower as well, although that could be that running an 800MB model on a 1GB machine is cutting it close for swap etc. Tolu's got a fun movie of the vis running somewhere though. A grad student (Pardeep) will probably work more on it over the next year and maybe bring it to life under BOINC for "public consumption."
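A rough back-of-envelope estimate (my own arithmetic, using the grid sizes and level counts quoted in this thread) of how the grid-point count grows between the two models:

```python
# Current model: 96x73 horizontal grid, 17 vertical levels.
current_points = 96 * 73 * 17    # 119,136 grid points

# Hi-res model: doubled horizontally, 30 vertical levels.
hires_points = 192 * 146 * 30    # 840,960 grid points

ratio = hires_points / current_points
print(ratio)  # about 7.06
```

So per-grid-point state alone accounts for roughly a 7x jump; any remaining growth presumably comes from per-level work arrays, diagnostics, and output buffers.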
Joined: 5 Aug 04 Posts: 390 Credit: 2,475,242 RAC: 0

> oh yeah I think it's 30 levels in the vertical compared to 17 for the current
> model. I don't quite understand why the memory usage shoots up so much, I
> assume it's just MetOffice "mojo." The one we tried was a slab ocean too, so
> a dynamically coupled one would probably be 2GB of RAM! :-)
>
> runtime was quite a bit slower as well although that could be that running an
> 800MB model on a 1GB machine is cutting it close for swap etc. Tolu's got a
> fun movie of the vis running somewhere though. A grad student (Pardeep) will
> probably work more on it over the next year and maybe bring it to life under
> BOINC for "public consumption."

Yeah, count me in - even for testing. 2GB of RAM is waiting... This is a task for real servers.

Carl - is there any update (RSS or whatever) about progress so far? On the client side, should we expect a new build on public launch, etc.?
Joined: 5 Aug 04 Posts: 55 Credit: 87,392 RAC: 0

Has anyone at the Met Office or wherever spent any significant time trying to reduce memory or improve speed? I expect when you're running your stuff daily on a Cray you can tend to forget hardware limitations, simply because there aren't that many. Just like Windows is now huge because it doesn't need to be small.

I'd be interested in trying to find a few code optimisations if the model source is available. As a quick calculation, assuming that the model spends all its time number-crunching flat out, every 1% improvement in this area would cut about 7 hours off a 30-day model.

<a href="http://www.users.globalnet.co.uk/~sykesm/cpdn.html"><img src="http://www.users.globalnet.co.uk/~sykesm/gfx/sig.jpg"></a>
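The quick calculation above checks out; a trivial sketch of the arithmetic:

```python
# A 30-day model running flat out is 30 * 24 = 720 hours of wall-clock time.
run_hours = 30 * 24
saving_per_percent = run_hours * 0.01
print(saving_per_percent)  # 7.2 hours saved per 1% speed-up
```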
Joined: 5 Aug 04 Posts: 30 Credit: 39,745 RAC: 0

> I'd be interested in trying to find a few code optimisations if the model
> source is available.

Not that I should comment, but I once came across this Unified Model User Guide regarding climate models: http://www.cgam.nerc.ac.uk/um/doc/umug/html/index.htm

I suspect this User Guide is the forerunner of the CPDN HadSM3 climate models and specifically applies to the HadCMn models, but I found it quite interesting in trying to figure out the architecture of HadSM3. It would seem that in this instance the source is stored on the Cray at the Met Office, as per Section 3.1. Interestingly enough, this reinforces the existence of Fortran plus C source code.

I'm presuming that HadSM3 is a natural progression from HadCMn, although I'm prepared to be shot down in flames. I don't know and would welcome some input, but I feel such a document for HadSM3 would be welcome, or in fact confirmation that this HadCMn document applies to HadSM3. Clearly there are many folks out there who are interested in this granularity of detail.

<img src="http://boinc.mundayweb.com/cpdn/stats.php?userID=11"> <i>UK4CP @ www.uk4cp.co.uk (United Kingdom Group) Celeron 2.6GHz XP Pro SP1 768MB RAM</i>
Joined: 5 Aug 04 Posts: 390 Credit: 2,475,242 RAC: 0

Thanks for the link, Mikey. Chapter 4 discusses the resolution issue. Is it possible to have a hybrid resolution: double resolution for the ocean (192x146) and normal resolution for the atmosphere? Chapter 4 also suggests the horizontal resolution can be adjusted to one's needs and computer resources, but the vertical resolution (layers) should be chosen more carefully. A compromise needs to be found - larger resolution will also affect dl/ul traffic etc. As I understand from the Open Day, we run rather more models with a larger parameter space and fewer detailed models (like the Earth Simulator?).
Joined: 5 Aug 04 Posts: 907 Credit: 299,864 RAC: 0

It's pretty tough, since there are a number of "mods" that different scientists use to cleanly insert code into spots, so you can optimize one area and somebody may put in a for loop that iterates over millions of elements two lines down. Tolu & I experimented with vectorizing some stuff by hand, mainly the long radiation timestep, and were getting big improvements; however, the reliability of the results was an issue ("on paper" it looked fine, but in practice weird things were happening after a while).
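As an illustration only (the actual model is Met Office Fortran, not shown here), hand vectorization of the kind described above replaces a per-element loop with a single whole-array operation. The function names and the arithmetic are hypothetical:

```python
import numpy as np

def radiation_step_loop(flux, absorptance):
    """Naive scalar loop over every grid point."""
    out = np.empty_like(flux)
    for i in range(flux.size):
        out[i] = flux[i] * (1.0 - absorptance[i])
    return out

def radiation_step_vectorized(flux, absorptance):
    """Same arithmetic expressed as one whole-array operation."""
    return flux * (1.0 - absorptance)

rng = np.random.default_rng(0)
flux = rng.random(100_000)
absorptance = rng.random(100_000)

# The two versions must agree bit-for-bit in spirit; any drift between
# them is exactly the kind of reliability problem described above.
assert np.allclose(radiation_step_loop(flux, absorptance),
                   radiation_step_vectorized(flux, absorptance))
```

The transformation is mechanical, but as the post notes, verifying that the reordered floating-point arithmetic still gives trustworthy results over a long run is the hard part.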
©2024 cpdn.org