[OT] End-user uses for x86-64

Ric Werme ewerme at comcast.net
Sun Feb 18 16:00:34 EST 2007


Ben Scott wrote:

>  Such as?  Serious question; I'm at most a very casual student of
>micro-architectures, so I don't know.  I enjoy learning, though.  So
>educate me.  :)

>>   Hehehe.  And Windowz is also sometimes credited for the success of
>> the Pentium.  Does that define it as a killer app?

>  Perhaps.  Indeed, Microsoft bloat has powered quite a bit of
>hardware sales.  Have you seen the recommended system configurations
>for Vista?  Supercomputers modeling the Earth's climate need less
>power.

I'm sure you meant that comment sarcastically, but as long as you are
interested in education, http://www.hpcwire.com/hpc/343276.html from
Feb 2005 says:

  "Because the amount of detail we include in the global coupled climate
  models depends directly on the limitations of supercomputer resources,
  clearly the faster the computers, the more detail we can include and the
  better information we can obtain for climate change," Meehl
  said. "Therefore, advances in global coupled climate modeling depend
  critically on the speed and availability of high end supercomputer
  resources."

  So what kind of computing power is needed to do all of this?

  William Collins, an NCAR expert on the strengths and limits of climate
  models, says that NCAR's Community Climate System Model 3 (CCSM3), a
  computer code that represents the atmosphere, land surface, ocean, sea ice
  and the interactions among these components, requires roughly 1 quadrillion
  floating-point operations to simulate a year of climate. For the IPCC
  report, NCAR simulated roughly 10,800 years of climate -- about 8.5
  wall-clock years of dedicated 24/7 execution on 200 Power4 CPUs.

  In addition, the weather scientists are handling vast amounts of data. The
  climate model generates about 10GB of output per simulation
  year. Simulations for the IPCC report generated over 100TB of model
  output. Without the mass storage devices, high-speed networks and massively
  parallel analysis systems, such advances could not be made. Clearly
  supercomputing is vital to this research.

  Vector computers seem to be the systems of choice for this type of research
  as well. The CCSM3 runs on Japan's Earth Simulator and on the new Cray X1 at
  Oak Ridge National Laboratory. Scientists at NCAR say that these machines
  deliver much higher computational throughput than cache-based MPP computers.

A 1980s climate model might run comfortably on a modern PC, but those
models ignored some rather basic stuff like mountains and oceans.  Sure,
Microsoft has a well-deserved reputation for bloat, but please make
claims that are supportable, to reduce the noise level on this list.

       -Ric Werme


More information about the gnhlug-discuss mailing list