Message boards : Number crunching : Harlan is a high level language for general purpose GPU computing
Joined: 5 Sep 04 Posts: 7629 Credit: 24,240,330 RAC: 0
These climate models from the UK Met Office are written in Fortran. The Met Office has no plans to re-write them for GPUs.
Joined: 31 Aug 04 Posts: 391 Credit: 219,896,461 RAC: 649
These climate models from the UK Met Office are written in Fortran. Partly, I think, because the Fortran climate models have been used, tested, verified and cross-checked for many years. Partly because the climate models are not particularly well suited to the GPU paradigm. Partly because re-writing, testing, and cross-verifying the code would cost a great deal. And the current GPUs, and their software, are not well enough defined and tested to spend at least a few lifetimes of skilled labour for a merely probable gain.
Joined: 6 Aug 04 Posts: 264 Credit: 965,476 RAC: 0
Both SETI@home and Einstein@home use GPUs, which can give a ten-fold increase in speed. But there are a number of caveats. You need one application if you use an Nvidia board and another if you use an ATI/AMD board. These applications are produced by volunteer programmers, at least in the SETI case, and you must have an app_info.xml file in your project directory to download them. You must also have the correct drivers for your board, which are upgraded very frequently by their vendors. If you don't, you will produce a lot of invalid results, and those workunits must be resent to other volunteers, which puts strain on the servers. So GPUs are a mixed blessing. I don't use them. Tullio
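For readers unfamiliar with the app_info.xml mechanism mentioned above: it is BOINC's "anonymous platform" file, telling the client to use locally supplied binaries instead of server-distributed ones. A minimal sketch follows; the file and application names here are hypothetical, and a real file must match the exact names the project expects.

```xml
<app_info>
    <app>
        <name>setiathome_enhanced</name>
    </app>
    <!-- Hypothetical volunteer-built GPU binary -->
    <file_info>
        <name>setiathome_cuda.exe</name>
        <executable/>
    </file_info>
    <app_version>
        <app_name>setiathome_enhanced</app_name>
        <version_num>610</version_num>
        <plan_class>cuda</plan_class>
        <!-- An ATI/AMD build would declare a different coprocessor type,
             which is why each vendor needs its own application. -->
        <coproc>
            <type>CUDA</type>
            <count>1</count>
        </coproc>
        <file_ref>
            <file_name>setiathome_cuda.exe</file_name>
            <main_program/>
        </file_ref>
    </app_version>
</app_info>
```

If the declared coprocessor or the installed drivers do not match the actual board, the application will fail or return invalid results, as described above.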
Joined: 13 Jan 06 Posts: 1498 Credit: 15,613,038 RAC: 0
... These applications are produced by volunteer programmers ... I gather that the generation of models we're running now is roughly a million lines of Fortran each, and the new models (HadGEM), which have been intermittently discussed, are around 10 million lines each. That is probably too much code to reasonably expect volunteers to migrate (and in any case, the source code is strictly controlled by the Met Office's Hadley Centre). Can a GPU cope with a task of that complexity? I'm a volunteer and my views are my own. News and Announcements and FAQ
Joined: 6 Aug 04 Posts: 264 Credit: 965,476 RAC: 0
... These applications are produced by volunteer programmers ... The Titan supercomputer, which topped the Top500 list last November and has now been surpassed by Tianhe-2, uses 16,000 Nvidia Tesla boards as coprocessors. It runs Fortran, since the Linpack benchmark was written in parallel Fortran. I suspect that Nvidia is using both SETI@home and Einstein@home as a testing ground for its GPUs. ATI/AMD is giving less support to the OpenCL programming environment, while Nvidia uses its own proprietary CUDA. Tullio
Joined: 7 Aug 04 Posts: 2187 Credit: 64,822,615 RAC: 5,275
Can a GPU cope with a task of that complexity? The research I've read on using GPUs in weather/climate models sees them being employed for specific types of calculation, such as radiation computations. They can greatly speed up certain specific calculations, but they are not, at this time, suited to the total complexity of these huge models. That may change in the future, but even so, the time needed for the Met Office to make such changes, and then adapt them for CPDN, would be very long.
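To illustrate why a calculation like radiation maps well to the GPU paradigm while the whole model does not: each atmospheric column can be updated independently, so one GPU thread per column exposes massive parallelism. The following is a hypothetical CUDA sketch, not code from any Met Office model; the kernel name, grid dimensions, and the crude Stefan-Boltzmann-style formula are all invented for illustration.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Toy "radiation" kernel: one thread per atmospheric column.
// Columns are independent, so tens of thousands run in parallel.
__global__ void radiative_update(const float *temp, float *flux,
                                 int ncols, int nlevels) {
    int col = blockIdx.x * blockDim.x + threadIdx.x;
    if (col >= ncols) return;
    for (int k = 0; k < nlevels; ++k) {
        int i = col * nlevels + k;
        // Crude Stefan-Boltzmann-like term: flux = sigma * T^4
        float t2 = temp[i] * temp[i];
        flux[i] = 5.67e-8f * t2 * t2;
    }
}

int main() {
    const int ncols = 1 << 16;   // illustrative grid: 65,536 columns
    const int nlevels = 60;      // illustrative: 60 vertical levels
    size_t n = (size_t)ncols * nlevels;

    float *temp, *flux;
    cudaMallocManaged(&temp, n * sizeof(float));
    cudaMallocManaged(&flux, n * sizeof(float));
    for (size_t i = 0; i < n; ++i) temp[i] = 250.0f;  // placeholder state

    radiative_update<<<(ncols + 255) / 256, 256>>>(temp, flux, ncols, nlevels);
    cudaDeviceSynchronize();

    printf("flux[0] = %g W/m^2\n", flux[0]);
    cudaFree(temp);
    cudaFree(flux);
    return 0;
}
```

Dynamics and physics routines with tight data dependencies between grid points gain far less from this pattern, which is why only selected kernels, not the whole million-line model, are candidates for GPU offload.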
©2024 cpdn.org