Message boards : Cafe CPDN : The Climate Machine
Joined: 15 May 09 Posts: 4535 Credit: 18,962,600 RAC: 21,639
For those wanting to play with climate modelling while researchers plan the next batches for CPDN, try The Climate Machine; complete instructions are in the GitHub link.

Edit: After precompiling the packages, the tests failed for me, so it's back to the drawing board (or off to look for help).
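For reference, the generic workflow such GitHub instructions usually reduce to looks something like this, assuming the project uses Julia's standard Pkg tooling (the repository's own README is authoritative and the exact steps may differ):

```julia
# From a clone of the repository, started with `julia --project=.`
using Pkg
Pkg.instantiate()   # fetch and precompile the pinned dependencies
Pkg.test()          # run the package's own test suite
```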
Joined: 29 Oct 17 Posts: 1048 Credit: 16,390,249 RAC: 15,269
> For those wanting to play with climate modelling while researchers plan the next batches for CPDN, try The Climate Machine; complete instructions are in the GitHub link.

Thanks for posting this, Dave. I've not heard of this model before. I see Julia is the programming language; ECMWF were investigating Julia before I left.

It looks some way from a fully fledged climate model, though. Looking through the available docs, it appears it can only run various idealized scenarios, not the kind of work CPDN does with the Hadley Centre & ECMWF OpenIFS models. I might be wrong about that, though; their documentation is incomplete.

I would question the use of Julia for the entire model code. It's more efficient to use the right language depending on the purpose. In OpenIFS, the low-level computer interface (debugging/tracing/hardware) is handled in C, the number crunching is in Fortran (because Fortran compilers still produce the fastest code in general), and the upper-level control code is written in C++.

--- CPDN Visiting Scientist
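As an aside on mixing languages: a high-level language is not necessarily shut out of that layered approach. Julia, for example, can call compiled C routines in-process via `ccall`. A tiny illustration (nothing to do with OpenIFS or The Climate Machine):

```julia
# Call the C library's clock() directly from Julia, with no glue code.
t = ccall(:clock, Int32, ())
println("CPU clock ticks so far: ", t)
```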
Joined: 29 Oct 17 Posts: 1048 Credit: 16,390,249 RAC: 15,269
NVIDIA have some really interesting work going on in climate & weather in collaboration with weather centres & universities, a lot of it involving machine learning (or AI if you prefer). See the info about their Earth-2 digital twin work: https://www.nvidia.com/en-us/high-performance-computing/earth-2/ and follow the links to their model 'FourCastNet'. Hybrid models using CPU & GPU with ML are the way things are going. Some nice videos to watch in these links.
Joined: 15 May 09 Posts: 4535 Credit: 18,962,600 RAC: 21,639
> NVIDIA have some really interesting work going on in climate & weather in collaboration with weather centres & universities, a lot of it involving machine learning (or AI if you prefer).

Thanks, I will have a look. At the moment I am trawling the issues and discussion on GitHub to try and work out what I need to change to get it to work.

Edit: Without understanding Julia I don't think I stand a chance. Having only ever used ALGOL and BASIC, it is likely to be a steep learning curve, given that my last use of either was some decades ago!
Joined: 5 Aug 04 Posts: 1120 Credit: 17,202,915 RAC: 2,154
> In OpenIFS, the low-level computer interface (debugging/tracing/hardware) is handled in C, the number crunching is in Fortran (because Fortran compilers still produce the fastest code in general), and the upper-level control code is written in C++.

While I have no intent to contradict you, I wonder if claims like this are actually useful, even if true.

At one time, I was working as part of a two-man team writing an assembly-level optimizer for the C compiler. We were given a bunch of benchmarks to optimize, and we got some truly impressive speed-ups. For the famous Whetstone benchmark, supposedly a test of floating-point computation, we got over a 10,000:1 speedup. Several things combined to produce that. Whetstone had several modules, and one was thought to be a test of floating-point computation because it was called 10,000 times and did a bunch of floating-point operations. That module was actually there to test function and subroutine calling overhead, and we defeated it by expanding the routine in-line. The loop-invariant code motion optimization then moved all those floating-point operations outside the loop, causing an enormous speed-up. Finally, live-dead analysis noticed the results were never used, so it eliminated the instructions (including the loop overhead) altogether.

Marketing was pleased because we could do that benchmark so much better than Motorola (who made a better processor than we did). We had a huge IBM 370 machine running UNIX, and it gathered a lot of data, so we had them tell us how many processes were run per day and which programs took the most time. nroff/troff (the text processor) was the biggest, so we ran that through our optimizer and it sped up a little bit (IIRC 10%), but nowhere near 10,000:1.

IMAO, it does not matter much how good a compiler is (unless it is really awful), or how good the programming language is. What matters is what the algorithms are and how well the system is programmed, and fixing those is what really matters these days. So the language best used is probably the one with which the programmers are most familiar, and IMAO, FORTRAN is not it. For purely numeric calculation I preferred Algol-60, but I would hesitate to recommend it for CPDN, since my guess is that most programmers have never even heard of it, and I do not know of any compilers for it either. I got to be pretty good at C and C++, but I have not written anything in over 20 years, so I am probably nowhere near as good a programmer as I used to be before I retired.
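For anyone who has not watched an optimizer do this, here is a minimal sketch of the Whetstone effect described above. It is in Julia (the language under discussion in this thread) rather than C, but any optimizing compiler behaves much the same: a loop whose result is invariant and never used may be hoisted out and then deleted entirely, so the "benchmark" measures almost nothing.

```julia
# Sketch only: how loop-invariant code motion plus dead-code
# elimination can hollow out a naive benchmark loop.
function fake_benchmark(n)
    x = 1.0
    for _ in 1:n
        y = x * 2.5 + 0.3      # loop-invariant: identical every iteration
        z = sin(y) * cos(y)    # result never escapes the loop
    end
    # Nothing observable survives, so the compiler is free to hoist the
    # arithmetic out of the loop and then drop it as dead code.
    return nothing
end

# Making the result escape keeps the computation alive and measurable.
function real_benchmark(n)
    acc = 0.0
    for i in 1:n
        y = i * 2.5 + 0.3      # depends on i, so it cannot be hoisted
        acc += sin(y) * cos(y) # accumulated and returned: cannot be dropped
    end
    return acc
end
```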