Designing a data center with room to grow

Oct. 1, 1999
A merger of legacy equipment with new systems required a well-designed data center and cable-management systems.

James Dorsett

Chatsworth Products Inc.

Located between the Pacific Ocean and beautiful Cascade Mountains, Portland-based GE Capital Colonial Pacific Leasing has experienced the kinds of challenges most companies dream of. Continued phenomenal growth over the past several years resulted in the need for a new data center to support not only the company's burgeoning in-house staff of 300, but also a far-flung network of independent leasing agents.

Colonial Pacific is a company devoted entirely to the leasing of business machines, primarily data-processing and networking equipment. Originally a wholly owned subsidiary of business-machines manufacturing giant Pitney Bowes and subsequently renamed Colonial Pacific Leasing, the firm was recently acquired by General Electric. The nature of the business, however, remains the same--servicing organizations ranging from Fortune 500 companies to small and home-based offices, with "small-ticket" equipment leases valued at up to $1.5 million.

Planning pays off

Colonial Pacific serves customers throughout the United States, in addition to a number of international accounts. In recent years, the company's annual purchases of new equipment for lease have exceeded $500 million. Actual lease arrangements are handled by independent brokers located in remote sites. These brokers are given access to Colonial Pacific's corporate local area network to store large volumes of customer information and record the details of all transactions.

Given the scope of the business, it is difficult to imagine the company functioning effectively without a highly centralized data facility. The volume of transactions, the value of the leased properties, and the sheer number of customer records are too large and critical to the operation to permit anything other than a closely managed network center staffed on a continual basis.

"It's such a complex system," marvels Mark Spencer, project manager at Colonial Pacific and the manager of its new data center. "We've been continually upgrading since we occupied the new facility, and we've reached the point where every rack is full from top to bottom. Every slot on every router, switch, and server is full as well. We're constantly adding to the system, but who knows? The company just keeps growing."

One thing Spencer feels he does know is that the basic architecture of the highly centralized network-management headquarters for the company will prove adequate to accommodate any additional hardware over the next several years. The data-center facilities have already provided for the nearly seamless integration of a large base of legacy equipment designed around dumb-terminal emulation with a modern client-server network. The client-server network now extends across the country over leased Integrated Services Digital Network lines into the offices of numerous independent leasing agents serving the company.

Spencer's confidence does not appear misplaced. Despite the considerable growth that has already occurred during the first few months of operation, the physical capacity of the new data center hasn't come close to being exhausted--which Spencer attributes to the system design and the careful projection of equipment needs before construction began. He also credits the flexibility afforded by the equipment infrastructure and cable-management systems provided by Chatsworth Products Inc. (CPI), a key equipment provider and design consultant on the project.

Easy access to electronics

With steady growth throughout the 1990s, Colonial Pacific has had to update its automated record-keeping system repeatedly to keep pace with its growing customer base. In 1997, company management determined that the existing headquarters in Portland did not provide adequate space to house the data-network center and the staff required to operate it, and that new corporate offices were needed. As a result, management decided to build an entirely new structure to accommodate current equipment needs and provide expansion space for years to come. Floor space within the data center would total more than five times the area of the previous facility.

Before construction commenced, Colonial Pacific secured the services of a data-center design firm, Northwest Information Services (Portland, OR). The design firm worked closely with architect and master building contractor R&H Construction (Portland) to ensure that the physical space of the data center and the electrical systems in the building as a whole would conform to the requirements of the enterprise hub. Actual equipment installation was performed by Allen/Falk (Beaverton, OR), a cabling contractor under the supervision of Northwest Information Services. CPI supplied the rack system to mount the equipment and manage the extensive cable runs. All worked together with Kenny Harrison, manager of information-technology services for Colonial Pacific, to design a space that would truly serve the organization's needs.

Two computer-aided design (CAD) programs were used to optimize the physical layout: Visio Technical and By Design, the latter a program developed by CPI. Cable-management systems were also provided by CPI. According to Art Rosado, marketing support engineer for Allen/Falk, CPI's design "allows for interaction between the active electronics and physical infrastructure."

This interaction is evident in the final design, which called for two walls of open floor-to-ceiling racks spanning the length of the room and easily accessible from both front and back. "No need for locks or glass doors," explains Glenn Sexton, president of Northwest Information Services. "The room itself was secured, and it was more important to have all of the equipment readily accessible."

The installation itself, according to Sexton, resembled a military operation. "Complicating the installation was management`s timetable," notes Sexton. "The company was unable to suspend operations during the business day, so the relocation of equipment from the original data center to the new facility had to be accomplished in a single weekend. This made the planning and design stages even more critical to the success of the project. We had a good deal of time to design the system, but very little to put it together."

Once the plans were completed, Allen/Falk did all the cabling and installed the racks in the data center. "No problems there," recalls Rosado. "The rack system is easy to put together. That's one of the main reasons we use it."

The space allocated to networking equipment--about 1500 square feet in a secure ground-floor facility with a raised floor and restricted access--is quite extensive. Nevertheless, the installation proved challenging due to the very limited amount of time available to complete it and the unusual degree of equipment concentration that it demanded. This one data center houses more than 20 pieces of electronics equipment, including enterprise-class servers, routers, and switches. More than 1500 network nodes, including connections to remote sites, terminate in the same room. Over 300,000 feet of Category 5 cabling snake through the building, and some 277 connections had to be made at the desktop.
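The figures above can be sanity-checked with simple arithmetic: assuming the 300,000 feet of Category 5 is spread across the roughly 1,500 nodes, the average run works out to about 200 feet, comfortably inside the 100-meter (328-foot) channel limit that governs Category 5 horizontal cabling.

```python
# Back-of-the-envelope check of the cabling figures quoted in the article.
# Assumption: the 300,000 feet of Category 5 is spread across all 1,500 nodes.
TOTAL_CABLE_FT = 300_000
NODES = 1_500
CAT5_CHANNEL_LIMIT_FT = 328  # 100 meters, the TIA/EIA-568 channel limit

avg_run_ft = TOTAL_CABLE_FT / NODES
print(f"Average run: {avg_run_ft:.0f} ft")                     # 200 ft
print(f"Within Cat 5 limit: {avg_run_ft <= CAT5_CHANNEL_LIMIT_FT}")  # True
```

Averages conceal outliers, of course; in an all-home-run design like this one, the longest desktop drops are the runs that actually determine whether the channel limit holds.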

The telephone system, based on a Lucent Definity private branch exchange (PBX), shares cabling with the data center and resides in a closed, dedicated space in one corner of the networking center. Extensive crossconnections were required between the PBX and data network, and a large internal phone system added complexity to the installation.

The network is the province of the information-services staff, although it was designed to be administered remotely, as well. Service personnel can access software on individual machines in the midst of hardware maintenance operations.

The central servers and switches that form the heart of the system flank the workstations. Five Hewlett-Packard 9000 servers sit to the right of the workstations in a tight row. The legacy equipment, as well as the new Compaq 5000 servers, Cisco Catalyst 5500 switches, and Cisco 4000 and 4500 routers, resides on the left side of the room. Legacy equipment also includes HP NetServers.

Merging the legacy with the new

Another factor complicating the installation was the tremendous diversity of components and networking protocols in use. An extensive amount of legacy equipment was moved from the previous headquarters and combined with a volume of new equipment to augment the existing system. The current installed network incorporates Novell and Windows subnetworks and various large HP and Compaq servers, along with the Cisco routers and switches. Several CD-ROM towers and tape backup--the latter allocated to a separate room within the data center--provide additional data storage. 100Base-T switched Ethernet defines the overall physical layer, with switching to the desktop.

As much energy was devoted to wrestling with the physical disposition of the equipment within the dedicated space as was spent on the integration of the diverse operating systems and protocols. The final design placed all networking equipment within the central control space and provided for no auxiliary equipment closets. Home-run wiring was required from each node back to the switches. To manage the profusion of wire, the designer specified structured cabling that would combine three signal carriers in one jacket--two for data and one for telephone. Every desk received all three.

Within the data center itself, cables were taken up through the floor and distributed both vertically and horizontally through accessory distribution rings. "Internal crossconnections between each rack go through the floor," explains Rosado, "while everything directed to the desktop goes through the ceiling." Clusters of cables were terminated at patch bays on the racks and distributed at intervals among the server banks. All cables were color-coded and marked for quick identification and easy manipulation.
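The article does not spell out the labeling convention, but a position-based scheme keyed to the color coding described above can be sketched in a few lines. The rack/panel/port naming and the color-to-carrier mapping below are purely illustrative, not Colonial Pacific's actual scheme.

```python
# Illustrative cable-labeling sketch. The structured-cabling design puts
# three carriers in one jacket (two data, one voice), so each desk drop
# terminates three color-coded cables at the same rack position.
# Hypothetical color assignments -- not the installers' real convention.
COLORS = {"data1": "blue", "data2": "green", "voice": "white"}

def cable_label(rack: int, panel: int, port: int, carrier: str) -> str:
    """Build a human-readable label, e.g. 'R02-P1-24-BLUE'."""
    color = COLORS[carrier].upper()
    return f"R{rack:02d}-P{panel}-{port:02d}-{color}"

# All three carriers for one desk drop share a rack position.
labels = [cable_label(2, 1, 24, c) for c in COLORS]
print(labels)  # ['R02-P1-24-BLUE', 'R02-P1-24-GREEN', 'R02-P1-24-WHITE']
```

Encoding the rack, panel, and port into the label is what makes the "quick identification" the installers describe possible: a technician at either end of a run can locate its far-end termination without tracing the cable.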

Each rack has separate AC power through a dedicated power strip. All AC power lines internal to the data center terminate at a multikilowatt Exide uninterruptible power supply positioned along the wall facing the workstations, which, in turn, is supplied with 480-volt three-phase power. "The system is fully backed up and highly redundant," says Sexton. "We weren't taking any chances."

Thanks to the open rack architecture and the presence of two large Liebert climate-control compressors, supplemental forced-air cooling was not required. The heating, ventilation, and air-conditioning system was designed to provide adequate ventilation not only for the existing 20-plus servers, routers, and switches, but also for full occupancy of the entire space allocated for equipment, which will ultimately hold nearly twice the current complement.

"We're still evolving," says Colonial Pacific's Spencer. "We're moving from the dumb-terminal emulation model to the Internet. Eventually, we'll probably support videoconferencing to our remote locations." The installation has become something of a showcase in the Pacific Northwest information-services community. And all the while, Colonial Pacific experienced virtually no interruption of service.

The 1500-square-foot data center in Colonial Pacific Leasing's Portland office provides critical support to a widely dispersed network of independent sales agents.

A raised floor facilitates vertical cable runs to the data center`s patch panels. The rack and cable-management systems organize the termination of more than 300,000 feet of Category 5 cabling.

James Dorsett, registered communications systems designer (RCDD), is marketing manager at Chatsworth Products Inc. (Westlake Village, CA). He can be contacted at tel: (818) 735-6100, e-mail: [email protected].
