Message boards : Number crunching : Little work, yet the most "important" thing in the world?

Mr. P Hucker

Joined: 9 Oct 20
Posts: 690
Credit: 4,391,754
RAC: 6,918
Message 63654 - Posted: 10 Mar 2021, 18:20:28 UTC - in response to Message 63644.  
Last modified: 10 Mar 2021, 18:20:54 UTC

If the 3 groups that did work using climate models that were only for Windows wanted to do that type of research, then that's up to them.
And now they have all disappeared.
And the BOINC work wasn't their main work, just an interesting side line.

It took a lot of work, conferences, meetings, etc., for the Oxford people to get them interested in the first place.
It's sad that that seems to have ended, but it isn't the fault of the majority of research centers, who seem to prefer to code for Linux.
I'm not a programmer but.... once you've made it in one OS, isn't "porting" (if that's the right word) to another OS a lot simpler than writing it again? Most projects seem to issue tasks for all OSs, even Android, which can't give them much performance. I can only assume the Linux users are returning the tasks fast enough for CPDN's liking.
Dave Jackson
Volunteer moderator

Joined: 15 May 09
Posts: 4535
Credit: 18,966,742
RAC: 21,869
Message 63656 - Posted: 10 Mar 2021, 19:40:20 UTC - in response to Message 63654.  
Last modified: 10 Mar 2021, 19:54:51 UTC

I'm not a programmer but.... once you've made it in one OS, isn't "porting" (if that's the right word) to another OS a lot simpler than writing it again? Most projects seem to issue tasks for all OSs, even Android, which can't give them much performance. I can only assume the Linux users are returning the tasks fast enough for CPDN's liking.

I have done little more than dabble in programming, and most of that was before MS-DOS, never mind Windows, existed. I do know, however, that for some tasks porting a program between operating systems is trivial. For others, which depend on pre-written libraries for whichever operating system is in use, it is far from trivial. Sometimes the library (or an equivalent) does not exist for the OS you want to port to, and the programmer may not have the requisite skills to write it themselves.

Other times it seems straightforward, but despite numerous attempts and many hours looking at code and error messages, it proves impossible to find what is causing the problem on one operating system. Both of these scenarios have occurred for CPDN in the past, which is why most tasks are for a single OS. The only current exception is the HadCM3 tasks, which will run on Mac or Linux. That pairing tends to be easier than Linux-Windows because macOS is Unix-based, so it has a lot in common with Linux, even if Apple tries to dissuade end users from looking under the bonnet.
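The missing-library problem is easy to demonstrate even from a scripting language. A minimal sketch (not CPDN code; `fcntl` and `msvcrt` are real Python standard-library modules, each shipped only on its own platform) of the kind of per-OS branching a port can force:

```python
import sys

# fcntl (POSIX file locking) ships with CPython on Linux/macOS but not on
# Windows, so code that depends on it needs a per-OS branch or a replacement.
if sys.platform == "win32":
    import msvcrt  # Windows-only file-handle helpers

    def lock_file(f):
        # msvcrt.locking locks a byte range starting at the current position.
        msvcrt.locking(f.fileno(), msvcrt.LK_NBLCK, 1)
else:
    import fcntl  # POSIX-only: this module simply does not exist on Windows

    def lock_file(f):
        # Exclusive, non-blocking lock on the whole file.
        fcntl.flock(f.fileno(), fcntl.LOCK_EX | fcntl.LOCK_NB)
```

When the missing piece is a small, well-documented call like this, the branch is easy; when it is a large numerical or graphics library with no counterpart on the target OS, the "branch" can amount to rewriting the library.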
JIM

Joined: 31 Dec 07
Posts: 1152
Credit: 22,363,583
RAC: 5,022
Message 63660 - Posted: 11 Mar 2021, 3:28:28 UTC
Last modified: 11 Mar 2021, 3:35:11 UTC

The truth is that this project has strayed from its original intent. That intent was to allow ordinary people to contribute to scientific research using their ordinary home computers. Those machines overwhelmingly run Windows. Most people don't and never will use Linux; it's just not user-friendly enough. If you don't believe me, just read all the posts from Linux users who can't get this project running on their machines. Windows runs this project straight out of the box. The average person isn't going to spend days hunting down the right compatibility libraries and installing them.

Also, as the models get bigger and more complicated, they are starting to exceed the capabilities of most people’s equipment. The average person isn’t going to buy a bigger, more expensive machine to run climate models. They will just move to other projects or give up.

Sorry about the rant.
Les Bayliss
Volunteer moderator

Joined: 5 Sep 04
Posts: 7629
Credit: 24,240,330
RAC: 0
Message 63661 - Posted: 11 Mar 2021, 5:03:45 UTC - in response to Message 63660.  

No, that's a fair comment.

After 5-8 years the Oxford people had done all they could/wanted to, and they looked elsewhere to see if there was a chance to keep things going.
I don't know why the other research groups have stopped, but I know that money was a big factor for the ANZ group.
It took a year for them to luck out and come across a big new storage facility that was just starting and looking for customers.

And I heard some years back, that the person trying to run this in one of the SE Asia centers could only get access to a storage computer for a month at a time.

I don't know what the answer is, if any, or how close the whole project is to closing.
Dave Jackson
Volunteer moderator

Joined: 15 May 09
Posts: 4535
Credit: 18,966,742
RAC: 21,869
Message 63662 - Posted: 11 Mar 2021, 6:06:56 UTC - in response to Message 63661.  

Pretty sure part of the problem is that as the scientists learn more, the work they are doing has got more complex, and that means bigger models that require faster computers with more memory to run in a reasonable time. The fact that many batches are linked to students' PhDs accounts for much of the erratic nature of work availability.

I would question whether it was "ordinary people" donating their computer time at the start, however, when tasks even on fast computers would take over six months to complete (or not, as the case may be).
JIM

Joined: 31 Dec 07
Posts: 1152
Credit: 22,363,583
RAC: 5,022
Message 63669 - Posted: 11 Mar 2021, 14:58:33 UTC - in response to Message 63662.  

Pretty sure part of the problem is that as the scientists learn more, the work they are doing has got more complex, and that means bigger models that require faster computers with more memory to run in a reasonable time. The fact that many batches are linked to students' PhDs accounts for much of the erratic nature of work availability.

I would question whether it was "ordinary people" donating their computer time at the start, however, when tasks even on fast computers would take over six months to complete (or not, as the case may be).


I don't know about others, but I was one of those ordinary people, running the 160-year models on a single-core 1.2 GHz laptop with 512 MB (that's right, mega not giga) of RAM. They took 3,300 hours to complete. They were a challenge.
Jean-David Beyer

Joined: 5 Aug 04
Posts: 1120
Credit: 17,202,915
RAC: 2,154
Message 63670 - Posted: 11 Mar 2021, 16:08:26 UTC - in response to Message 63654.  

I can only assume the Linux users are returning the tasks fast enough for CPDN's liking.


My Linux box returns N216 models in about 7 to 8 days. The lower number (Average turnaround time 6.78 days) is because I have been getting some UK Met Office HadCM3 short v8.36 tasks lately.

https://www.cpdn.org/show_host_detail.php?hostid=1511241
Mr. P Hucker

Joined: 9 Oct 20
Posts: 690
Credit: 4,391,754
RAC: 6,918
Message 63672 - Posted: 11 Mar 2021, 19:05:43 UTC - in response to Message 63670.  

I can only assume the Linux users are returning the tasks fast enough for CPDN's liking.


My Linux box returns N216 models in about 7 to 8 days. The lower number (Average turnaround time 6.78 days) is because I have been getting some UK Met Office HadCM3 short v8.36 tasks lately.

https://www.cpdn.org/show_host_detail.php?hostid=1511241

By "fast enough" I meant that all the tasks were being taken from the server queue. Does the Linux queue often run dry? If there's a backlog, using Windows as well would help. Unless it's difficult or impossible to convert the program.
SolarSyonyk

Joined: 7 Sep 16
Posts: 262
Credit: 34,915,412
RAC: 16,463
Message 63675 - Posted: 11 Mar 2021, 21:03:18 UTC - in response to Message 63672.  

The N216 queue has been plenty full for a long while, and it's getting fuller. The "short" ones are new, and it looks like there's an awful lot of work available there too.

There are quite a few active users on them, though. I won't argue that Linux takes a bit more work to set up, but given how touchy the HadAM* units are about random shutdowns (I've had to configure a few of my servers to stop BOINC some while before shutting down at night, and others only sleep), I expect there's more than trivial work required to port it to Windows. In addition to the usual "But these take FOREVER!" complaints.

If you're volunteering to help port the code and validate results, reach out. But scientific compute code tends to be a special sort of "touchy and easy to break."
Les Bayliss
Volunteer moderator

Joined: 5 Sep 04
Posts: 7629
Credit: 24,240,330
RAC: 0
Message 63676 - Posted: 11 Mar 2021, 21:11:40 UTC

The programs used by Windows and by Linux are totally different.
One is like a car, and the other a diesel locomotive.
SolarSyonyk

Joined: 7 Sep 16
Posts: 262
Credit: 34,915,412
RAC: 16,463
Message 63678 - Posted: 11 Mar 2021, 23:47:08 UTC - in response to Message 63676.  

The programs used by Windows and by Linux are totally different.
One is like a car, and the other a diesel locomotive.


No, they're really not... at least not for scientific compute. That doesn't mean it's always easy to port stuff, but these programs read files, allocate memory, run compute-bound tasks, and write files. Zero kernel communication outside that.
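As a toy illustration of that pattern (purely hypothetical code, not CPDN's actual model): a compute-bound loop whose only contact with the OS is ordinary file I/O for checkpoints, which is exactly the part that runs unchanged on any platform:

```python
import json

def run_model_step(state):
    # Pure arithmetic: a crude relaxation step standing in for model physics.
    # Each cell is averaged with its right-hand neighbour (wrapping around).
    return [0.5 * (x + y) for x, y in zip(state, state[1:] + state[:1])]

def run_model(n_steps, checkpoint_path="checkpoint.json"):
    state = [float(i) for i in range(8)]  # initialise a tiny "model grid"
    for step in range(n_steps):
        state = run_model_step(state)     # compute-bound work, no OS calls
        if step % 10 == 0:
            # Periodic checkpoint via plain file writes, so an interrupted
            # run loses at most a few steps of work.
            with open(checkpoint_path, "w") as f:
                json.dump({"step": step, "state": state}, f)
    return state
```

Nothing here is OS-specific; porting difficulty comes in when real models lean on compiled numerical libraries, compiler-specific behaviour, or build toolchains that exist for only one platform.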
Les Bayliss
Volunteer moderator

Joined: 5 Sep 04
Posts: 7629
Credit: 24,240,330
RAC: 0
Message 63679 - Posted: 12 Mar 2021, 0:20:12 UTC

The Windows programs are Weather At Home 2 (wah2).
The Linux programs are HadAM4.
Mr. P Hucker

Joined: 9 Oct 20
Posts: 690
Credit: 4,391,754
RAC: 6,918
Message 63689 - Posted: 13 Mar 2021, 19:42:14 UTC - in response to Message 63676.  

The programs used by Windows and by Linux are totally different.
One is like a car, and the other a diesel locomotive.

Was that just supposed to mean very different, or are you also implying one is way more advanced/more powerful/sponges more off the taxpayer than the other?
Iain Inglis
Volunteer moderator

Joined: 16 Jan 10
Posts: 1084
Credit: 7,798,786
RAC: 5,264
Message 63708 - Posted: 17 Mar 2021, 11:52:46 UTC

The trajectory of increasing model complexity seems quite natural to me. There were global models and then regional ones; slab oceans and then stratified ones; low resolution and then higher resolution; and who knows what internal changes in the science and the parameters.

On the computer side there has not been a compensating application of Moore’s Law. Computers are faster but not fast enough; disks are certainly larger and I would guess large enough; memory is cheaper but oddly constrained by operating system and supplier (for Windows on my Dell, at least); the number of cores has increased but the applications have not exploited that because multi-processor implementations don’t necessarily reduce the time for an ensemble of models to complete even if individual models would finish quicker.
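That ensemble point can be put in numbers. A back-of-envelope sketch (illustrative figures, not CPDN measurements): with perfect scaling, parallelising each model doesn't finish the ensemble any sooner than running one model per core, and with realistic scaling losses it finishes later:

```python
def ensemble_time(n_models, n_cores, threads_per_model, model_hours, efficiency=1.0):
    """Wall-clock hours to finish a whole ensemble of identical models.

    threads_per_model cores are devoted to each running model; `efficiency`
    (<= 1.0) discounts the imperfect speed-up of parallelising one model.
    """
    concurrent = n_cores // threads_per_model       # models running at once
    per_model = model_hours / (threads_per_model * efficiency)
    waves = -(-n_models // concurrent)              # ceiling division
    return waves * per_model

# 8 cores, 8 models of 100 CPU-hours each:
one_per_core = ensemble_time(8, 8, threads_per_model=1, model_hours=100)
eight_way = ensemble_time(8, 8, threads_per_model=8, model_hours=100,
                          efficiency=0.7)  # assumed 70% parallel efficiency
```

Here `one_per_core` is 100 hours, while `eight_way` is about 143 hours, even though each individual model in the second scheme finishes in under 18 hours. Multi-threading only wins the ensemble race when there are fewer models than cores, or when memory/cache pressure makes one-per-core infeasible.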

(PS And it does make me laugh when climate-change deniers say that climate scientists are in it for the money when there isn’t any money!)
Alan K

Joined: 22 Feb 06
Posts: 491
Credit: 30,945,228
RAC: 13,933
Message 63714 - Posted: 17 Mar 2021, 23:25:12 UTC - in response to Message 63708.  

"the number of cores has increased but the applications have not exploited that because multi-processor implementations don’t necessarily reduce the time for an ensemble of models to complete even if individual models would finish quicker."

Especially for the N216 models, which use a lot of L3 cache. For instance, my i5 has 3 MB of L3 cache for 4 cores. Does a high-end Ryzen with 24 cores have 18 MB?
Dave Jackson
Volunteer moderator

Joined: 15 May 09
Posts: 4535
Credit: 18,966,742
RAC: 21,869
Message 63717 - Posted: 18 Mar 2021, 6:00:52 UTC - in response to Message 63714.  

"the number of cores has increased but the applications have not exploited that because multi-processor implementations don’t necessarily reduce the time for an ensemble of models to complete even if individual models would finish quicker."

Especially for the N216 models, which use a lot of L3 cache. For instance, my i5 has 3 MB of L3 cache for 4 cores. Does a high-end Ryzen with 24 cores have 18 MB?


My Ryzen 7 3700X has 32 MB for 8 cores/16 threads, and it still slows the tasks right down if I run more than 5 N216 tasks at once.
Jim1348

Joined: 15 Jan 06
Posts: 637
Credit: 26,751,529
RAC: 653
Message 63719 - Posted: 18 Mar 2021, 19:48:49 UTC - in response to Message 63714.  
Last modified: 18 Mar 2021, 19:53:24 UTC

Does a high-end Ryzen with 24 cores have 18 MB?

The Ryzen 3900X has 24 virtual cores (12 full cores) and 64 MB of L3 cache. That is the same cache as the 3950X with 32 cores; they just don't use the defective cores on the 3900X and can sell it for less, but still keep the larger cache.

I have a couple of each, but curiously the extra cache does not do any good on the N216, as compared with the Ryzen 3600. In fact, the 3600 is better and I can run four N216 without much slowdown versus only two or three on the bigger ones. I think that is because of how the larger cache is accessed on the bigger chips. They have to package two CPUs up together, and get them to share memory or IO or whatever. It works well on some projects but not others.

But yes, a multi-core (multi-threaded) app would be very useful here. It would solve a lot of problems and increase efficiency. But I have never even given it a thought, due to the old code they have to use and the limited access they have to work with it. Someone in the U.K. Met Office, or whoever they are, should get interested in it.
Alan K

Joined: 22 Feb 06
Posts: 491
Credit: 30,945,228
RAC: 13,933
Message 63722 - Posted: 18 Mar 2021, 23:09:59 UTC - in response to Message 63719.  

Thanks for the info.
Mr. P Hucker

Joined: 9 Oct 20
Posts: 690
Credit: 4,391,754
RAC: 6,918
Message 63735 - Posted: 26 Mar 2021, 18:39:18 UTC - in response to Message 63708.  

The trajectory of increasing model complexity seems quite natural to me. There were global models and then regional ones; slab oceans and then stratified ones; low resolution and then higher resolution; and who knows what internal changes in the science and the parameters.

On the computer side there has not been a compensating application of Moore’s Law. Computers are faster but not fast enough; disks are certainly larger and I would guess large enough; memory is cheaper but oddly constrained by operating system and supplier (for Windows on my Dell, at least);
Really? I just built a Ryzen 9 3900XT. I put 64 GB into it and it will take 128 GB. BOINC needs nothing like that.

the number of cores has increased but the applications have not exploited that because multi-processor implementations don’t necessarily reduce the time for an ensemble of models to complete even if individual models would finish quicker.

(PS And it does make me laugh when climate-change deniers say that climate scientists are in it for the money when there isn’t any money!)
(PS No, they're just nuts or worry warts.) I recycle and use renewables so we don't run out of important things like oil to make plastic, not to stop the climate changing an infinitesimal amount, which will be nothing like natural occurrences anyway. CO2 is food for plants; there was tonnes of it when the dinosaurs were around, and plants flourished. We want that: more crops. If you live next to the sea and drown when the sea level allegedly rises, well, why did you buy a house there? I run this project in the hope the scientists will see the error of their ways. We need to stop over-reacting, for example battery cars using all the lithium up! That is not renewable; they can't even recycle it properly yet.
TR UNESCO Global Geopark

Joined: 7 Jul 17
Posts: 14
Credit: 112,227,205
RAC: 137,849
Message 63745 - Posted: 30 Mar 2021, 22:09:01 UTC - in response to Message 63735.  


©2024 cpdn.org