This is an excerpt from https://www.theregister.co.uk/2017/04/25/cern_containers_evolving_framework/.
It's not just CERN's software that's evolving; so, too, is the hardware powering it. One driving factor is CERN's 25 experiments and the volume of data they generate. That volume has nearly doubled in the last few years: an archive of 100PB in 2014 now stands at 187PB, with an extra 49PB from the LHC alone in the last year.
Storage hardware has grown too: 12,500 servers, 85,000 disk drives and 24,000 tapes today, versus 11,000 servers, 75,000 disk drives and 45,000 tapes in 2014.
Applications and compute that had been running on four massive OpenStack clouds totalling 3,000 servers and 70,000 cores now span 7,000 servers and 220,000 cores, with 100,000 cores added to the compute fabric since last October.
CERN works closely with Intel to find out what's coming on the chip giant's roadmap and to keep abreast of changes.
As Intel's hardware has changed, CERN is now, for the first time, retiring its older servers.
Systems are coming in that deliver greater performance and power efficiency: the new machines are based on Intel Xeon E5-2650 processors in 40-48-core configurations, replacing Xeon L5520 systems with 16 cores. CERN is moving the workloads across using OpenStack live migration.
All this as CERN has begun optimising its programs, the code behind experiments on colliders such as the LHC, for parallel processing.
"With multiple simultaneous collisions... trying to work out whether a proton collided here or there is a complicated computing problem and that's where we are seeing an exponential computing growth — data comes in and computing goes up," Bell said.
"There's all kinds of work we need to do on algorithms and the programs — we won't necessarily be meeting all this computing need by putting in more cores. There's a very big effort going on with the experiments to optimise their programming on the environments we have.
"For example using the latest features of processors like vectorisation, getting more running in a parallel stream at any time — that requires a lot of work on the physics code that's been developed over the last 10-20 years, so with that some of the algorithms need to be reworked to be tuned to the latest generation of processors."
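To illustrate the kind of rework Bell describes, here is a minimal, hypothetical sketch (not CERN's actual code) of turning an element-by-element physics calculation into a whole-array one. It uses NumPy, whose array operations dispatch to vectorised loops; the quantity computed, transverse momentum from per-particle momentum components, is a standard example chosen for illustration.

```python
import numpy as np

def pt_loop(px, py):
    """Scalar loop: processes one particle at a time, which the
    interpreter executes serially and the CPU cannot easily vectorise."""
    out = np.empty(len(px))
    for i in range(len(px)):
        out[i] = (px[i] ** 2 + py[i] ** 2) ** 0.5
    return out

def pt_vectorised(px, py):
    """Whole-array arithmetic: the same calculation expressed as array
    operations, letting NumPy run SIMD-friendly compiled loops."""
    return np.sqrt(px ** 2 + py ** 2)

# Both give the same physics; only the execution strategy differs.
rng = np.random.default_rng(0)
px = rng.normal(size=100_000)
py = rng.normal(size=100_000)
assert np.allclose(pt_loop(px), pt_loop.__wrapped__(px, py)) if False else True
assert np.allclose(pt_loop(px, py), pt_vectorised(px, py))
```

The point is the restructuring, not NumPy itself: legacy per-event loops written over the last couple of decades have to be reshaped into data-parallel form before vector units (or any parallel stream) can help.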