We needed an ambiguous title for the alternating looks at cabling failure vs. cabling excellence on display in this video slideshow chronicling our best reader-submitted snapshots from the job site:
i.e. -- One can "get into a jam" on a structured cabling site (sometimes just by showing up); or one can "jam" on a cabling job, in the sense of turning in a "rock star" performance. When both conditions obtain within the same job's timeline, it's time for a "before and after" photographic celebration.
However, some of the cable jams most irritating to serious ICT professionals are ones that happen less visibly, away from the back of the rack -- and that leak cold air, especially in data centers.
"Placing data center power and data cables in overhead cable trays instead of under raised floors can result in an energy savings of 24%," contends APC / Schneider Electric's Victor Avelar, who goes on to explain, "Raised floors filled with cabling and other obstructions make it difficult to supply cold air to racks. The raised floor cable cutouts necessary to provide cable access to racks and PDUs result in a cold air leakage of 35%. The cable blockage and air leakage problems lead to the need for increased fan power, oversized cooling units, increased pump power, and lower cooling set points."
Penned by Avelar, an evergreen white paper (2011) from APC / Schneider Electric examines these issues in detail and quantifies their energy impact on facilities. The paper's introduction reads as follows:
"Although not considered a best practice from an energy efficiency point of view, a common method for cooling data center equipment is to utilize a raised floor as a plenum for the delivery of cold air to the intakes of servers. The cold air is forced underneath the floor by fans within air handlers. However, this method is not the only option.
"Many new data centers today forgo the expense of the raised floor and place equipment on a hard floor. They cool their servers by utilizing in-row, overhead, or room air conditioning with hot aisle containment.
"The hard floor approach also forces the issue of placing cables overhead and many data centers have become accustomed to working with overhead cables. In both cases, the data center owners have to resolve the issue of how to lay out power and data cables.
"Data centers that depend on raised floor cooling distribution often route network data and power cabling underneath the raised floor. This cabling then feeds individual IT racks through cable cutouts at the back each rack. These cable cutouts allow cold air to bypass the IT server inlets at the front of the racks and mix with the hot air at the back of the rack. This design practice can lead to hot spots, clogged floors, and overall lower cooling system efficiency.
"This paper analyzes the effect of under-floor cabling on cooling and on electrical consumption and concludes that the decision to place network data and power cabling into overhead cable trays can lower cooling fan and pump power consumption by 24%."