Handover of the new data centre to Abrechnungszentrum Emmendingen

Highlight: online measurement and control of cold water inlet temperatures


Abrechnungszentrum Emmendingen (ARZ) is a billing service provider for statutory health insurers and other social security institutions. Since it handles roughly 8 million patient data records and over 4 billion euros of billing each year, it is no surprise that the reliable execution of data centre activities is a top priority for ARZ's clients. To maintain a high level of reliability, ARZ decided to build a second data centre in 2010. This project, however, was special. For the first time, ARZ paid special attention to ensuring the infrastructure was as energy-efficient as possible. As the general contractor, bit GmbH integrated a special instrumentation and control system that continuously identifies and exploits potential energy savings.

ARZ supports over 100 clients with its expertise in billing and data management. Every day, it receives, scans and processes documents with up to 180,000 pages of billing information. To process this vast volume as swiftly as possible, high-performance scanners capture the data in a largely automated process and forward it to a proprietary billing program. The document management system is certified to TÜViT's PK-DML standard. As with financial service providers, ARZ's activities are heavily dependent on technology – its current workforce would be simply unable to perform them manually.

An auditor recommended that ARZ set up another data centre to prepare for future requirements and increase its reliability by mirroring the "old" data centre. Once ARZ decided not to lease computing power at a third-party data centre, it hired Litcos (now Rittal) as a specialist designer for the preliminary planning.

Significance of availability for ARZ
One example illustrates the importance of operational reliability and availability for ARZ's data centre. ARZ performs a wide range of invoice verification services for its clients. To ensure prompt payments and obtain cash discounts or favourable interest rates, some clients require settlement and payment within five days of receiving the documents. If the data centre broke down, it would need at least six hours just to restart and resume regular operations. Processing the transactions that piled up during the downtime, however, would take two weeks. The new data centre addresses this eventuality by serving as a backup. ARZ also maximizes availability by transferring data to a remote data centre.

Challenge – delivering a new, fully operational, energy-efficient data centre within a tight schedule
Klaus Scharbach, IT department head and project manager, explained the next steps: "Once we had selected the basic design and cleared up the financing, we requested proposals from general contractors, which we are required to do as a public-law company. We quickly realized that the bid submitted by bit GmbH from Karlstein am Main matched our requirements the best. After talking to bit's other customers in person, we were convinced that they were the right partner for us." Michael Häfele, chief executive officer, added, "First, bit GmbH promised to wrap up the project by the completion date, which was crucial since we were already very far along in the schedule. Also, bit unlocked tremendous savings opportunities, especially in our energy-hungry air-conditioning set-up. This certainly pleased our accountants. When building a data centre, you have to keep an eye on not just the upfront costs, but also the operating costs, as they make up the lion's share of total costs over the years." And so began an innovative data centre infrastructure project that aimed to increase energy efficiency from the very start.

New construction with highly efficient rack air-conditioning
Even the most modern server systems convert the bulk of the electricity they consume into heat. This is why the first step in boosting overall data centre efficiency is to examine the air-conditioning. The ideal solution carries heat away where it is produced – in the rack. In the end, the project team opted for a Rittal LCP (Liquid Cooling Package) cooling solution. LCPs were mounted in three rows of six racks each – between the racks and at the ends of the rows – to ensure a reliable, adequate supply of cool air even if one unit were to fail. Each LCP contains air/water heat exchangers, fan modules and a monitoring system. This keeps the racks and cooling system temperature-neutral for the room. The LCP supplies cooling air that is recirculated horizontally in the closed racks and blown into the front of the racks. This arrangement prevents hotspots and eliminates the need to size the cooling capacity for an entire room. The room itself was set up as an "IT security cell" inside the new data centre building. This is a high-security server room that protects the hardware from fire and water using special materials and construction methods. In addition, the set-up includes a multi-stage early fire detection system, an argon fire suppression system and ventilation systems for air-conditioning. bit also made sure not to cross data and power cables.

Innovative solution – inexpensive cooling
The unusual thing about bit's infrastructure solution is that all the air-conditioning components are monitored by an end-to-end, networked observation system. Energy accounting is essential for green IT – energy costs make up the bulk of operating costs. "We wanted to prevent the internal controllers of the air-conditioning systems from affecting each other adversely, so we developed an entirely new kind of control system," said Gunther ter Bahne, Managing Director of bit GmbH. "Our application aggregates information from all connected systems. Everything comes together in a separate control cabinet over common bus protocols such as SNMP and Modbus and through digital and analogue inputs. An integrated console – which can also be accessed online – analyses this data, shows trends in addition to current operating data, and archives the data for later statistical analyses. If customisable thresholds are breached, the system will respond by taking appropriate action, such as sending fault messages to the building automation system."
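The threshold-and-response behaviour described above can be sketched in a few lines. This is an illustrative simplification, not bit's actual Visu+ application: the metric names, limits and fault-message format are assumptions, and the real system polls its values from SNMP and Modbus devices before forwarding faults to the building automation system.

```python
# Illustrative sketch of threshold monitoring over aggregated readings.
# Metric names and limits are assumptions, not the real configuration.

THRESHOLDS = {  # metric -> (lower limit, upper limit); None = no limit
    "rack_inlet_temp_c": (15.0, 27.0),
    "chilled_water_flow_lpm": (40.0, None),
}

def check_readings(readings, thresholds=THRESHOLDS):
    """Return fault messages for every reading outside its limits."""
    faults = []
    for metric, value in readings.items():
        lo, hi = thresholds.get(metric, (None, None))
        if lo is not None and value < lo:
            faults.append(f"{metric}={value} below limit {lo}")
        if hi is not None and value > hi:
            faults.append(f"{metric}={value} above limit {hi}")
    return faults  # in the real system: forwarded to building automation
```

In practice the same loop would also archive each reading for the trend displays and later statistical analyses the article mentions.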

One key feature of this solution is its ability to control cold water inlet temperatures based on LCP data. To keep rack temperatures within the desired range, each LCP controls its inlet valve – opening the valve to lower the temperature, closing the valve to raise it. Valve position can be checked via SNMP. "And that's what makes our solution so special," said Michael Botzem, one of bit's programmers. "We look at how open the valve is. If it isn't completely open, we can safely raise the inlet temperature. Essentially, we are lowering the cooling capacity of the recoolers. As the data centre is not completely filled with equipment, we only have to supply as much cooling capacity as is currently needed."
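The control idea – warmer water when every valve still has headroom, colder water when any valve is nearly wide open – can be sketched as follows. All names, percentages and temperature limits here are assumptions for illustration; the real controller reads the valve positions via SNMP.

```python
# Hypothetical sketch of valve-position-based setpoint control.
# Thresholds and step size are assumed values, not bit's parameters.

def adjust_setpoint(valve_positions_pct, setpoint_c,
                    high=95.0, low=80.0, step=0.5,
                    min_c=10.0, max_c=18.0):
    """Nudge the cold water inlet setpoint based on LCP valve openings.

    valve_positions_pct: per-LCP inlet valve opening in percent.
    Only the busiest LCP matters: it must still be satisfiable.
    """
    busiest = max(valve_positions_pct)
    if busiest >= high:    # a valve is almost fully open: need colder water
        setpoint_c -= step
    elif busiest <= low:   # all valves have headroom: warmer water suffices
        setpoint_c += step
    # clamp to a safe operating band
    return min(max(setpoint_c, min_c), max_c)
```

Raising the setpoint in small steps, with a dead band between the two thresholds, keeps the loop from fighting the LCPs' own internal controllers – the interference problem the article says bit set out to avoid.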

Further components lower energy consumption even more. They include not only rooftop refrigerating compressors to reduce cold water outlet temperatures, but also "free cooling". In regions with a moderate climate, it is possible to cool without compressors for much of the year. The above system for regulating cold water temperatures extends the free cooling season even more. All these steps greatly improve energy efficiency, but bit GmbH has even more tricks up its sleeve. The new data centre building does not have special heating systems for its infrastructure rooms or system management and data centre management offices. Where needed, the radiators receive their warm water from heat pumps fed by cold water that has absorbed heat from the servers. And if there is still heat left over, it is fed into the heating system of the nearby office building. Over the years, this solution will translate into lower operating costs because ARZ will only use as much electricity as absolutely necessary.
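Why a warmer water setpoint extends the free cooling season can be shown with a toy calculation. Free cooling works whenever outdoor air is colder than the water setpoint minus the dry cooler's approach temperature, so a higher setpoint qualifies more hours of the year. The temperatures and the 4 K approach below are assumed figures for illustration only.

```python
# Toy illustration (assumed numbers): a higher chilled-water setpoint
# lets the dry coolers handle more hours without running compressors.

def free_cooling_hours(outdoor_temps_c, setpoint_c, approach_c=4.0):
    """Count hours in which compressors can stay off."""
    limit = setpoint_c - approach_c  # outdoor air must be below this
    return sum(1 for t in outdoor_temps_c if t <= limit)
```

For the same outdoor temperature profile, raising the setpoint from 12 °C to 16 °C moves the free-cooling limit from 8 °C to 12 °C and so captures additional compressor-free hours.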

Klaus Scharbach is proud to be running a model data centre that not only offers maximum energy efficiency, but was also completed on schedule and has plenty of room to grow into future requirements. "We are extremely pleased by how bit completed the work in the promised timeframe – that certainly isn't a given in the construction industry. The first excavator was on our property in October 2010, and the building shell was finished before Christmas. Despite having several changes made during construction, we were able to accept the facility on 1 July 2011. The people working for bit are consummate professionals that you can count on 100%. So we knew that everything would work properly and moved into the new data centre the very next day, on 2 July."


Facts and figures on ARZ's new facility

  • New construction footprint: 400 square metres
  • Net area of the data centre: 70 square metres
  • Office area: 115 square metres
  • 18 server racks, 15 LCP modules, 3 network racks
  • Technical operating data analysis and air-conditioning control: bit solution based on Visu+
  • Energy supply: separate transformer connected to the mains, main low-voltage distribution board in the data centre, Newave UPS rated at 3 × 100 kVA (10 minutes of backup runtime) and a 520 kW emergency generator as an additional backup
  • Fully automatic argon fire suppression system in an IT security cell
  • Total construction time from groundbreaking to the start of operations: 9 months
  • Realisation: bit GmbH, Karlstein (general contractor)